#### Abstract

A two-parameter generalization of the Boltzmann-Gibbs-Shannon entropy based on the natural logarithm is introduced. The generalization of the Shannon-Khinchin axioms corresponding to the two-parameter entropy is proposed and verified. We present the relative entropy and the Jensen-Shannon divergence measure and check their properties. The Fisher information measure, the relative Fisher information, and the Jensen-Fisher information corresponding to this entropy are also derived. The Lesche stability and the thermodynamic stability conditions are verified as well. We propose a generalization of a complexity measure and apply it to a two-level system and a system obeying an exponential distribution. Using different distance measures, we define the statistical complexity and analyze it for two-level and five-level systems.

#### 1. Introduction

Entropy is a very important quantity and plays a key role in many aspects of statistical mechanics and information theory. The most widely used form of entropy was given by Boltzmann and Gibbs from the statistical mechanics point of view and by Shannon from the information theory point of view. Later, certain other generalized measures of entropy, like the Rényi entropy [1] and the Sharma-Mittal-Taneja entropy [2, 3], were introduced and their information theoretic aspects were investigated. Recently, in [4], a new expression for the entropy was proposed as a generalization of the Boltzmann-Gibbs entropy, and the necessary properties like concavity, Lesche stability, and thermodynamic stability were verified. This entropy has been applied to a wide variety of physical systems, in particular to long-range interacting systems [5, 6] and non-Markovian systems [7].

Most of the generalized entropies introduced so far were constructed using a deformed logarithm. But two generalized entropies, one called the fractal entropy and the other known as the fractional entropy, were proposed using the natural logarithm. The fractal entropy, which was introduced in [8], attempts to describe complex systems which exhibit a fractal or chaotic phase space. Similarly, the fractional entropy was put forward in [9] and later applied to study anomalous diffusion [10]. Merging these two entropies, in our present work we propose a fractional entropy in a fractal phase space. There are thus two parameters, one characterizing the fractional nature of the entropy and the other describing the fractal dimension of the phase space, and the functional form of the entropy depends only on the natural logarithm. We give the generalized Shannon-Khinchin axioms corresponding to this two-parameter entropy and prove that they uniquely characterize our entropy. The two-parameter relative entropy and the Jensen-Shannon divergence measure are also generated. The generalized Fisher information is derived from the relative entropy. The relative Fisher information measure and its associated Jensen-Fisher information measure corresponding to this entropy are also proposed. The thermodynamic properties like the Lesche stability and the thermodynamic stability are verified as well. Using our generalized entropy, we set up a two-parameter generalization of the well-known LMC (López-Ruiz, Mancini, and Calbet) complexity measure [11] and apply it to a two-level system and an exponential distribution. We consider different measures, like the Euclidean distance and the Wootters distance, for the disequilibrium, and construct complexity measures from these disequilibrium distances. Using two-level and five-level systems, these complexity measures are analyzed.

After the introduction in Section 1, we introduce our new two-parameter entropy in Section 2 and investigate its properties. In Section 3, the relative entropy and the Jensen-Shannon divergence measure corresponding to this two-parameter entropy are presented and their properties are studied. Using the relative entropy, the Fisher information measure, the relative Fisher information, and the Jensen-Fisher information are also obtained. The thermodynamic properties are analyzed in Section 4. In Section 5, we present a two-parameter generalization of the LMC complexity measure and analyze the complexity of a two-level system and a system with a continuous probability distribution. Subsequently, we analyze two- and five-level systems using the LMC complexity measure based on the Euclidean and the Wootters distances. We conclude in Section 6.

#### 2. Generalized Entropy and Its Axiomatic Characterization

The Boltzmann-Gibbs-Shannon (BGS) entropy, which is the expectation value of $-\ln p_i$, is generally expressed as
$$S = -k \sum_{i} p_i \ln p_i, \tag{1}$$
where $p_i$ represents the probability of the $i$th microstate and $k$ is a constant. A new form of entropy based on the natural logarithm was proposed in [8] by considering the effective probability $p_i^q$ to take into account incomplete information:
$$S_q = -k \sum_{i} p_i^q \ln p_i. \tag{2}$$
The entropy thus defined makes use of the $q$-expectation, given as follows:
$$\langle A \rangle_q = \sum_{i} p_i^q A_i. \tag{3}$$
The $q$-expectation value (3) characterizes incomplete normalization [8], $\sum_i p_i^q = 1$, which is known to occur in complex systems. Later, to account for the mixing which occurs due to interactions between the various states of the system, the same form of the entropy, but with the regular condition on the probabilities, that is, $\sum_i p_i = 1$, was discussed in [12, 13].

The Boltzmann-Gibbs-Shannon entropy can also be defined through the equation
$$S = -k \left. \frac{d}{dx} \sum_{i} p_i^x \right|_{x=1}. \tag{4}$$
Replacing the ordinary derivative by the Weyl fractional derivative, a new entropy was obtained by Ubriaco in [9]. The functional form of this entropy, which is the expectation value of $(-\ln p_i)^q$, reads as follows:
$$S_q = k \sum_{i} p_i (-\ln p_i)^q. \tag{5}$$
A salient feature of the fractal entropy (2) and the fractional entropy (5) is that they are functions of the ordinary logarithm, unlike the other generalized entropies [4, 14, 15], which are defined through the use of deformed logarithms.

Inspired by the fractal entropy (2) and the fractional entropy (5), we propose a two-parameter generalization of the Boltzmann-Gibbs-Shannon entropy
$$S_{q,\alpha} = k \sum_{i} p_i^q (-\ln p_i)^{\alpha}, \tag{6}$$
where $k$ is a generalization of the Boltzmann constant and $q$ and $\alpha$ are the parameters which are used to generalize the BGS entropy. The entropy is valid in the regime of $q$ and $\alpha$ for which it is concave, determined in Section 2.1 below. The probabilities obey the condition $\sum_i p_i = 1$. Entropy (6) can be considered a fractional entropy in a fractal phase space, in which the parameter $q$ comes from the fractal nature and the parameter $\alpha$ from the fractional aspect.

In the $q \to 1$ limit, entropy (6) reduces to the fractional entropy (5). Similarly, we recover the fractal entropy (2) in the $\alpha \to 1$ limit and the BGS entropy when both parameters attain the value of unity. A very interesting limiting case of (6) occurs when we set $\alpha = q$:
$$S_q = k \sum_{i} (p_i s_i)^q, \tag{7}$$
where $s_i = -\ln p_i$ is the single-particle Boltzmann entropy. The one-parameter entropy (7) is the sum of biased single-particle Boltzmann entropies. At this juncture, we would like to make a remark about the Boltzmann entropy and the Gibbs entropy. An explanation in [16] states that the Boltzmann entropy is the sum of the entropy calculated from the one-particle distribution, whereas the Gibbs entropy is computed directly from the $N$-particle distribution. This implies that the Boltzmann entropy and the Gibbs entropy are the same only when the systems are noninteracting. Looking into (7) from this point of view, we realize that this entropy can be understood in a similar setting; that is, the single-particle entropies are biased by a parameter, and this bias may be due to the presence of interactions. Such behaviour strongly resembles the characteristics of complex systems, in which the behaviour of the total system differs from that of the single-particle system due to the presence of interactions. So, we assume that entropy (7) described above may be a strong candidate for describing complex systems.
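Since the displayed equations are not reproduced here, the limiting behaviour can be checked numerically. The sketch below assumes the two-parameter form $S_{q,\alpha} = k\sum_i p_i^q(-\ln p_i)^{\alpha}$, which is consistent with the limits stated above; the function names are illustrative only.

```python
import math

def entropy_two_param(p, q, alpha, k=1.0):
    # Assumed two-parameter form S_{q,alpha} = k * sum_i p_i^q * (-ln p_i)^alpha,
    # consistent with the stated limits (q -> 1: fractional, alpha -> 1: fractal).
    return k * sum(pi**q * (-math.log(pi))**alpha for pi in p if pi > 0)

def entropy_bgs(p, k=1.0):
    # Boltzmann-Gibbs-Shannon entropy S = -k * sum_i p_i ln p_i.
    return -k * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]
# At q = alpha = 1 the generalized form reduces to the BGS entropy.
assert abs(entropy_two_param(p, 1.0, 1.0) - entropy_bgs(p)) < 1e-12
```

With `q = alpha = 1` the two forms agree to machine precision, in line with the stated BGS limit.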

Below, we present the two-parameter generalization of the Shannon-Khinchin axioms. Let $\Delta_N$ be the simplex of $N$-point probability distributions, defined as
$$\Delta_N \equiv \left\{ (p_1, \ldots, p_N) \;\middle|\; p_i \ge 0, \; \sum_{i=1}^{N} p_i = 1 \right\}, \tag{8}$$
in which the two-parameter entropy (6) satisfies the following axioms.

*(i) Continuity*. The entropy $S_{q,\alpha}$ is continuous in $\Delta_N$.

*(ii) Maximality*. For any $N$ and any $(p_1, \ldots, p_N) \in \Delta_N$,
$$S_{q,\alpha}(p_1, \ldots, p_N) \le S_{q,\alpha}\left(\frac{1}{N}, \ldots, \frac{1}{N}\right). \tag{9}$$
*(iii) Expansibility*. Consider
$$S_{q,\alpha}(p_1, \ldots, p_N, 0) = S_{q,\alpha}(p_1, \ldots, p_N). \tag{10}$$
*(iv) Generalized Shannon Additivity*. Consider the additivity relation (11) satisfied by the entropy of a joint distribution, whose explicit form is derived as (18) in Section 2.2, where
$$p_{ij} = p_i \, p(j|i). \tag{12}$$
The factor $p(j|i)$ appearing in (11) is the conditional probability, that is, the probability of occurrence of the $j$th event when a particular $i$th event has occurred.

##### 2.1. Concavity

The two-parameter entropic functional (6) has an extremum at $p_i = e^{-\alpha/q}$. The second derivative with respect to $p_i$ is
$$\frac{\partial^2 S_{q,\alpha}}{\partial p_i^2} = k\, p_i^{q-2} (-\ln p_i)^{\alpha-2} \Big[ (q-1)(-\ln p_i)\big(q(-\ln p_i) - \alpha\big) - (\alpha-1)\big(q(-\ln p_i) - \alpha\big) - q(-\ln p_i) \Big]. \tag{13}$$
For the function to be strictly concave, expression (13) should be less than zero, and the admissible range of $q$ and $\alpha$ is determined by this condition. The second derivative with respect to $p_i$ at the point of extremum is
$$\left. \frac{\partial^2 S_{q,\alpha}}{\partial p_i^2} \right|_{p_i = e^{-\alpha/q}} = -k\, \alpha \left(\frac{\alpha}{q}\right)^{\alpha-2} e^{\alpha(2-q)/q}. \tag{14}$$
From (14), it can be observed that the second derivative of the entropy at the extremum is uniformly negative in the region $q, \alpha > 0$, implying that the entropic functional at this point is uniformly concave for $q, \alpha > 0$. We illustrate the concavity of the entropic function through a set of plots shown in Figure 1.

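As a quick numerical illustration of the concavity discussion, the sketch below checks the sign of the second derivative of a single entropic term by finite differences. It assumes the two-parameter form $p^q(-\ln p)^{\alpha}$; the extremum location $p = e^{-\alpha/q}$ is a derived assumption under that form, not taken from the text.

```python
import math

def f(p, q, alpha):
    # Single entropic term p^q * (-ln p)^alpha (k = 1), assuming the
    # two-parameter form discussed in the text.
    return p**q * (-math.log(p))**alpha

def second_derivative(p, q, alpha, h=1e-5):
    # Central finite-difference approximation of d^2 f / dp^2.
    return (f(p + h, q, alpha) - 2 * f(p, q, alpha) + f(p - h, q, alpha)) / h**2

# Under the assumed form, each term has its extremum where q*(-ln p) = alpha,
# that is, at p = e^{-alpha/q}.
q, alpha = 0.9, 0.8
p_star = math.exp(-alpha / q)
assert second_derivative(p_star, q, alpha) < 0  # concave at the extremum
```

The negative sign of the finite-difference estimate agrees with the uniform concavity claimed at the extremum.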

##### 2.2. Generalized Form of Shannon Additivity

Let $p_{ij}$ be the probability of occurrence of a joint event, in which $p_i$ and $p_j$ are the probabilities of occurrence of the individual events. The two-parameter entropy corresponding to the joint probability can be written as
$$S_{q,\alpha}(A,B) = k \sum_{i,j} p_{ij}^q \,(-\ln p_{ij})^{\alpha}. \tag{15}$$
Expanding the logarithm in the above equation using the binomial theorem and isolating the leading term, we arrive at (16), where the binomial coefficients appear. Since the system is symmetric in $i$ and $j$, the binomial expansion of (15) can be written in an equivalent form as (17). Adding (16) and (17), the modified form of Shannon additivity (18) is obtained. When the two events are independent, that is, when the joint probability obeys the relation $p_{ij} = p_i p_j$, (18) simplifies into (19), whose last term describes the mixing between the various states and occurs due to the fractal nature of the phase space. The Shannon additivity relations corresponding to the various limiting cases of our two-parameter entropy (6) are listed below for the sake of completeness.

*Special Cases Corresponding to the Various One-Parameter Entropies*. (i) In the $q \to 1$ limit, the generalized Shannon additivity corresponding to the fractional entropy (5) is obtained. When the joint probability of the system obeys the relation $p_{ij} = p_i p_j$, we recover the pseudoadditivity relation proved in [9].

(ii) The generalized Shannon additivity corresponding to the fractal entropy (2) is recovered in the $\alpha \to 1$ limit, and the pseudoadditivity relation corresponding to this entropy given in [12] can be obtained under the condition $p_{ij} = p_i p_j$.

(iii) The generalized Shannon additivity relation corresponding to the one-parameter entropy (7) is obtained in the $\alpha = q$ limit. Imposing the condition $p_{ij} = p_i p_j$, we can get the corresponding pseudoadditivity relation.

##### 2.3. Uniqueness of the Two-Parameter Entropy

In this subsection, we prove the uniqueness of the two-parameter entropy which obeys the modified form of the Shannon additivity given in (18). It can be noticed that (18) is a symmetrized combination of the two equations (24) and (25). We come to this conclusion since in (18) there is symmetry between the first and the second term and also between the two terms within the summation. Since the *lhs* in (24) and (25) are equal, the *rhs* of these equations should also have matching individual terms. This implies that the entropic function can be separated into a product of a function of the probability and a function of its logarithm. Since the probability-dependent part is already known, what remains is to find the functional form of the logarithmic part. Using the separable form of the entropy and the structure of (24), we arrive at (26), which can be rewritten in the form (27). Comparing the coefficients, we get (28), and the only function which satisfies the form shown there is the logarithm. The entropy is a positive function, whereas the probabilities are restricted to the interval $0 \le p_i \le 1$, so the logarithm enters with a negative sign. Combining the two parts of the entropy, $p_i^q$ and $(-\ln p_i)^{\alpha}$, we get (6), the form of the entropy.

Finally, for the sake of completeness, we define the conditional entropy of a pair of discrete random variables with a joint distribution $p_{ij}$ as
$$S_{q,\alpha}(B|A) = k \sum_{i,j} p_{ij}^q \big(-\ln p(j|i)\big)^{\alpha}, \tag{29}$$
where $p(j|i)$ denotes the conditional probability.

#### 3. Generalized Divergence Measures

Divergence measures play an important role in the information theoretic analysis of any entropy, since the probability distribution of a random variable cannot always be found exactly, and because it is sometimes necessary to quantify the difference between two distributions. For the Shannon entropy, several such measures, like the Kullback-Leibler relative entropy and the Jensen-Shannon divergence, have been introduced and investigated in detail. In this section, we define these measures for the two-parameter entropy proposed in the previous section.

##### 3.1. Relative Entropy

If $P = \{p_i\}$ and $R = \{r_i\}$ are any two probability distributions, the two-parameter relative entropy corresponding to these distributions is defined as (30), where $q$ and $\alpha$ are the generalizing parameters. In the limit $q \to 1$, we recover the fractional relative entropy, and in the limit $\alpha \to 1$ the fractal relative entropy can be obtained. In the case where $\alpha = q$, we obtain the Kullback relative entropy corresponding to the one-parameter entropy described in (7). When both parameters are set to unity, we get back the Kullback relative entropy.

Below, we list the properties of the two-parameter relative entropy and prove them.

*(i) Nonnegativity*. The relative entropy is always nonnegative: $D_{q,\alpha}(P\|R) \ge 0$.

*(ii) Continuity*. $D_{q,\alpha}(P\|R)$ is a continuous function of its variables.

*(iii) Symmetry*. The relative entropy is symmetric under the simultaneous exchange of a pair of variables in the distributions $P$ and $R$: interchanging $p_i \leftrightarrow p_j$ together with $r_i \leftrightarrow r_j$ leaves it invariant.
*(iv) Possibility of Extension*. Appending a zero-probability event to both distributions leaves the relative entropy unchanged: $D_{q,\alpha}(p_1,\ldots,p_N,0\,\|\,r_1,\ldots,r_N,0) = D_{q,\alpha}(p_1,\ldots,p_N\,\|\,r_1,\ldots,r_N)$.
*(v) Pseudoadditivity*. For product distributions $P = P^{(1)} \otimes P^{(2)}$ and $R = R^{(1)} \otimes R^{(2)}$, the relative entropy obeys a pseudoadditivity relation containing a cross term in addition to the sum of the individual relative entropies.

*(vi) Joint Convexity*. The relative entropy is jointly convex in the pair of distributions $(P, R)$.

*Proof*. Entropy (6) is uniformly concave for positive $q$ and $\alpha$, and this proves property (i). Properties (ii), (iii), and (iv) can be proved trivially. The expression for pseudoadditivity in (v) follows from direct calculation. The uniform concavity of entropy (6) also ensures that the relative entropy satisfies the joint convexity stated in property (vi), proposed in [17]. To prove the joint convexity, we use the generalized form of the log-sum inequality, which can be obtained from the generalization of the Jensen inequality proposed in [17].

Since the relative entropy (30) is not symmetric in $P$ and $R$, we define the symmetric measure (36), which shares most of the properties of the relative entropy. It can be noticed that the Kullback-Leibler-type relative entropy (30) is undefined if the distribution $R$ vanishes at points where $P$ does not. Similarly, the symmetric form of the relative entropy (36) is undefined if either of the distributions vanishes. This implies that the distribution $P$ has to be absolutely continuous with respect to the distribution $R$ for measure (30) to be defined. For the case of the symmetric measure, the distributions $P$ and $R$ have to be absolutely continuous with respect to each other.

To overcome this, a modified form of the relative entropy is defined between the distribution $P$ and the distribution $M = (P + R)/2$, which is a symmetric sum of both $P$ and $R$. The mathematical expression corresponding to the modified relative entropy is given in (37). The alternative form of relative entropy proposed there is defined even when $P$ is not absolutely continuous with respect to the distribution $R$. Though it satisfies all the properties of the Kullback-Leibler relative entropy, it is not a symmetric measure. So, a symmetric measure based on (37) is defined as
$$\mathrm{JS}_{q,\alpha}(P\|R) = \frac{1}{2} D_{q,\alpha}\!\left(P \,\middle\|\, \frac{P+R}{2}\right) + \frac{1}{2} D_{q,\alpha}\!\left(R \,\middle\|\, \frac{P+R}{2}\right), \tag{38}$$
where $D_{q,\alpha}(\cdot\|\cdot)$ denotes the two-parameter relative entropy (30). This is a generalization of the Jensen-Shannon divergence measure [18] corresponding to the two-parameter entropy (6) defined in the previous section.

The symmetric measure based on the relative entropy (36) and the generalized Jensen-Shannon divergence measure (38) satisfy the set of properties (39), in which the divergence appearing there can be either (36) or (38). The symmetric form of the relative entropy and the Jensen-Shannon divergence measure are related via the expression (40), which clearly shows that the upper bound to the Jensen-Shannon divergence is given by the symmetric form of the relative entropy. A similar relationship also exists between (37) and (30), in which the relative entropy defines the upper bound of the modified relative entropy.
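In the $q, \alpha \to 1$ limit, the divergences above reduce to the classical Kullback-Leibler and Jensen-Shannon measures, for which the stated bound (the symmetric relative entropy dominating the Jensen-Shannon divergence) can be verified numerically. The sketch below works in that classical limit only.

```python
import math

def kl(p, r):
    # Classical Kullback-Leibler relative entropy (the q, alpha -> 1 limit).
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)

def jsd(p, r):
    # Jensen-Shannon divergence built from the midpoint distribution M = (P + R)/2.
    m = [(pi + ri) / 2 for pi, ri in zip(p, r)]
    return 0.5 * kl(p, m) + 0.5 * kl(r, m)

P = [0.7, 0.2, 0.1]
R = [0.2, 0.5, 0.3]
symmetric = kl(P, R) + kl(R, P)  # symmetric form of the relative entropy
assert 0.0 <= jsd(P, R) <= symmetric  # the symmetric measure bounds the JSD
```

The midpoint construction is what makes the divergence finite even when one distribution vanishes where the other does not.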

##### 3.2. Fisher Information Measure

The Fisher information measure for a continuous random variable with probability distribution $p(x)$ is defined as
$$I = \int \frac{1}{p(x)} \left(\frac{dp(x)}{dx}\right)^2 dx. \tag{41}$$
In [19], the Fisher information was obtained from the Kullback-Leibler relative entropy in the following manner: the relative entropy between a probability distribution and its shifted version is constructed, the integrand is then expanded as a Taylor series in the shift up to second order, and from this the Fisher information measure is recognized. Analogously, we proceed to derive the generalized Fisher information measure using the two-parameter relative entropy proposed in the previous subsection. The two-parameter relative entropy between the measure $p(x)$ and its shifted measure $p(x + \Delta x)$ is written down, the integrand is expanded up to second order in $\Delta x$, and the resulting expression is written in terms of a binomial series. Considering the first two lower-order terms, in line with the method adopted in [19], we obtain the two-parameter generalization of the Fisher information measure (45). Along the lines of (41), this expression can be defined through a $q$-expectation value as (46). The Fisher information measure (46) makes use of the $q$-expectation, and in the appropriate one-parameter limit it reduces to the expression obtained in [20].
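In the $q, \alpha \to 1$ limit, (41) is the classical Fisher information; as a sanity check, a Gaussian of width $\sigma$ should yield $I = 1/\sigma^2$. The sketch below verifies this with a simple trapezoidal quadrature (the grid and function names are illustrative).

```python
import math

def fisher_information(pdf, dpdf, xs):
    # I = integral of (p'(x))^2 / p(x) dx, via the trapezoidal rule on grid xs.
    vals = [dpdf(x)**2 / pdf(x) for x in xs]
    h = xs[1] - xs[0]
    return h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

sigma = 2.0
pdf = lambda x: math.exp(-x**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
dpdf = lambda x: -x / sigma**2 * pdf(x)  # derivative of the Gaussian density

xs = [-40 + 0.01 * i for i in range(8001)]  # grid wide enough to capture all mass
assert abs(fisher_information(pdf, dpdf, xs) - 1 / sigma**2) < 1e-6
```

Because the integrand decays rapidly, the trapezoidal rule is essentially exact here; the recovered value $1/\sigma^2 = 0.25$ is the standard location-family result.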

##### 3.3. Relative Fisher Information and Jensen-Fisher Divergence Measure

The relative Fisher information measure between two probability distributions $p(x)$ and $r(x)$ is given by
$$I(p\|r) = \int p(x) \left(\frac{d}{dx} \ln \frac{p(x)}{r(x)}\right)^2 dx. \tag{47}$$
The above relation is not symmetric, and so we define the symmetrized combination (48), a symmetric extension of (47). The disadvantage with these measures is the following: (47) requires the distribution $p$ to be absolutely continuous with respect to $r$, and relation (48) requires the distributions $p$ and $r$ to be absolutely continuous with respect to each other.

To surmount this, the modified relative Fisher information between two distributions $p$ and $r$ is taken to be the distance between each distribution and their midpoint $(p + r)/2$; its mathematical expression is given in (49). A two-parameter generalization of the recently proposed [21] Jensen-Fisher divergence measure can then be constructed via a symmetric combination of this modified form of the relative Fisher information, yielding (50). The Jensen-Fisher measure (50) is convex and symmetric and vanishes only when the two probability distributions $p$ and $r$ are identical everywhere.

Finally, we discuss the relevant limiting cases. All the information measures defined in this section reduce to the corresponding measures obtained through the use of the Shannon entropy in the $q, \alpha \to 1$ limit. In the $q \to 1$ limit, the information measures relevant to the fractional entropy are recovered. The $\alpha \to 1$ limit leads to the information measures of the fractal entropy.

#### 4. Thermodynamic Properties

The canonical probability distribution can be obtained by optimizing the entropy subject to the norm constraint and the energy constraint. Adopting a similar procedure for our two-parameter entropy (6), we construct the functional
$$\Phi = S_{q,\alpha} - \gamma \left(\sum_i p_i - 1\right) - \beta \left(\sum_i p_i \varepsilon_i - U\right), \tag{51}$$
where $\gamma$ and $\beta$ are Lagrange multipliers, $\varepsilon_i$ is the energy eigenvalue, and $U$ is the internal energy. Employing the variational procedure, we optimize the functional in (51) with respect to the probability to get (52). When the functional attains a maximum, its variation with respect to $p_i$ is zero, and using this in (52) yields the inverse of the probability distribution (54). Inversion of relation (54) to obtain the probability distribution is not analytically feasible, so we adopt a different method to derive the distribution. Since the variation has already been set to zero, we can integrate (52) to get (55), in which a constant of integration appears. Substituting the entropic expression (6) in (55), we should be able to solve for the probabilities. We do not have a closed-form solution for arbitrary $q$ and $\alpha$, but one should be able to obtain the $p_i$'s for given $q$ and $\alpha$ either analytically or numerically. The specific values of $q$ and $\alpha$ should come from real physical systems; here, we are only studying the general behaviour of the two-parameter entropy for an arbitrary distribution. For the sake of illustration, we provide plots of (54) in Figure 2 using numerical methods, assuming the energy to be a quadratic function of a variable $x$. In the first set of plots, we keep $q$ fixed and vary the values of $\alpha$; variation of $q$ with $\alpha$ held fixed is given in the second set of plots. From the graphs, we can observe that we obtain a two-parameter generalization of the Gaussian distribution. The Boltzmann-Gibbs distribution is included in all four plots for comparative purposes.


##### 4.1. Lesche Stability

A stability criterion was proposed by Lesche [22, 23] to study the stabilities of the Rényi and the Boltzmann-Gibbs entropy. The motivation for this criterion goes as follows: an infinitesimal change in the probabilities should produce an equally infinitesimal change in the observable. If $p$ and $p'$ are two probability distributions, Lesche stability requires that, for every $\epsilon > 0$, we can find a $\delta > 0$ such that
$$\sum_i |p_i - p'_i| < \delta \implies \left| \frac{S[p] - S[p']}{S_{\max}} \right| < \epsilon. \tag{56}$$
Using (56), a simple condition was derived in [24] for any generalized entropy maximized by a probability distribution. This condition, which is widely used to check the Lesche stability of generalized entropies, reads as (57), where the constant appearing there is defined in (58) in terms of the inverse probability distribution obtained in (54). In order to compute the constant, we integrate the inverse probability distribution with respect to the probability, as in (59). Using a suitable transformation of the integration variable, we can see that the integral on the *rhs* of (59) is zero. Similarly, it can also be noticed that the boundary contribution vanishes due to the occurrence of the natural logarithm. So, we finally obtain the value of the constant in (60), which leads to the conclusion that the criterion for Lesche stability is satisfied in our case.

##### 4.2. Thermodynamic Stability

The thermodynamic stability conditions of the Boltzmann-Gibbs entropy can be derived from the maximum entropy principle and its corresponding additivity relation. In [25, 26], it has been shown that concavity alone does not guarantee thermodynamic stability for generalized entropies, so we derive the stability conditions for the two-parameter entropy (6) *à la* the method developed in [26]. The pseudoadditivity relation for the two-parameter entropy is given in (61).
Considering an isolated system comprised of two identical subsystems, each of energy $E$, in equilibrium, the total entropy of the system would be $S(2E)$. Allowing for an exchange of energy $\Delta E$ from one subsystem to the other, the total entropy becomes that of subsystems with energies $E + \Delta E$ and $E - \Delta E$, whose pseudoadditivity relation following (61) is (62). Similarly, the pseudoadditivity relation corresponding to $S(2E)$ obtained using (61) is (63). From the maximum entropy principle, we know that the equal partition maximizes the entropy; that is, the total entropy with subsystem energies $E \pm \Delta E$ cannot exceed the total entropy with both subsystems at energy $E$ (64). Expanding (62) up to second order in $\Delta E$ and using the maximum entropy principle (64), we get the condition (65). The two-parameter entropy can be connected to its first and second derivatives via recurrence relations, which, substituted in (65), yield the simplified form (66).
From (66), we notice that, on the *rhs*, the first term is negative in the concavity region due to the concavity conditions imposed on the entropy. From the rest of the terms, we notice that the stability conditions are respected when the two parameters lie in complementary ranges on either side of unity. Under the limiting conditions $q, \alpha \to 1$, we recover the concavity condition for the Boltzmann entropy, which is also the thermodynamic stability condition for the Boltzmann-Gibbs entropy.

##### 4.3. Generic Example in the Microcanonical Ensemble

An isolated system in thermodynamic equilibrium can be described via the microcanonical ensemble, in which all the microstates are equally probable. Under conditions of equiprobability, that is, $p_i = 1/W$, the two-parameter entropy (6) becomes
$$S_{q,\alpha} = k\, W^{1-q} (\ln W)^{\alpha}, \tag{67}$$
where $W$ is the total number of microstates. In the limit $\alpha \to 1$, expression (67) reduces to the microcanonical entropy derived from the fractal entropy (2). Similarly, in the $q \to 1$ limit, we can obtain the entropic expression corresponding to the fractional entropy (5). When we set $\alpha = q$, we get the microcanonical entropy corresponding to the entropy in (7). For entropy (67), the temperature is defined through the relation
$$\frac{1}{T} = \frac{\partial S_{q,\alpha}}{\partial E}. \tag{68}$$
The definition of temperature for a generic class of systems, for which the density of states is related to the energy via a power law, is found from (68) to be (69). An analytic inversion of (69) to obtain the energy as a function of temperature is not feasible. Though the above illustration is given only for the microcanonical ensemble, a direct extension of this method to other kinds of adiabatic ensembles can be easily achieved.
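Under the assumed two-parameter form of the entropy, the equiprobable sum collapses to the closed-form microcanonical expression $k\,W^{1-q}(\ln W)^{\alpha}$; the sketch below confirms the collapse numerically (the closed form is itself an assumption tied to that entropic form).

```python
import math

def entropy_two_param(p, q, alpha, k=1.0):
    # Assumed two-parameter form S = k * sum_i p_i^q * (-ln p_i)^alpha.
    return k * sum(pi**q * (-math.log(pi))**alpha for pi in p if pi > 0)

# Equiprobability p_i = 1/W collapses the sum:
# sum over W identical terms (1/W)^q * (ln W)^alpha = W^(1-q) * (ln W)^alpha.
W, q, alpha = 100, 0.7, 0.9
closed_form = W**(1 - q) * math.log(W)**alpha
assert abs(entropy_two_param([1.0 / W] * W, q, alpha) - closed_form) < 1e-9
```

At $q = \alpha = 1$ the closed form reduces to $k \ln W$, the familiar Boltzmann result.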

#### 5. Complexity Measures

Physical systems in which the behaviour of the total system cannot be constructed from the properties of the individual components are generally defined as complex systems. Several measures have been proposed to quantify the complexity of physical systems [11, 27, 28]. One such measure is the disequilibrium-based statistical measure of complexity, popularly referred to as the LMC (López-Ruiz, Mancini, and Calbet) complexity measure, which was introduced in [11]. This measure is based on the logic that there are two extreme situations in which we find simple systems: one is the perfect crystal, in which the constituent atoms are symmetrically arranged, and the other is the completely disordered system, best characterized by an ideal gas, in which the system can be found in any of the accessible states with the same probability. The available information is very little in the case of a perfect crystal and is maximum for the ideal gas; the amount of information in the system can be quantified using the Boltzmann-Gibbs entropy. A new quantity called the disequilibrium was proposed, which is the distance from the equiprobable distribution and is maximum in the case of the crystal and zero for an ideal gas. The product of these two quantities was defined as the measure of complexity, which vanishes for both the perfect crystal and the ideal gas.

For a system consisting of $N$ accessible states with a set of probabilities $\{p_i\}$ obeying the normalization condition $\sum_{i=1}^{N} p_i = 1$, the complexity measure reads as follows:
$$C = H \cdot D = \left(-k \sum_{i=1}^{N} p_i \ln p_i\right) \left(\sum_{i=1}^{N} \left(p_i - \frac{1}{N}\right)^2\right). \tag{70}$$
The LMC measure of complexity was found to be a nonextensive quantity. A generalized measure of complexity was proposed in [29] based on the Tsallis entropy, with a view to absorbing the nonadditive features of the entropy. Similarly, a statistical measure of complexity corresponding to the two-parameter entropy (6) is defined as follows:
$$C_{q,\alpha} = S_{q,\alpha} \cdot D = \left(k \sum_{i=1}^{N} p_i^q (-\ln p_i)^{\alpha}\right) \left(\sum_{i=1}^{N} \left(p_i - \frac{1}{N}\right)^2\right). \tag{71}$$
In the limit $q, \alpha \to 1$, we recover the LMC complexity measure proposed in [11]. The LMC complexity measures corresponding to the fractal entropy and the fractional entropy are obtained in the $\alpha \to 1$ and $q \to 1$ limits, respectively. When we let $\alpha = q$, the complexity measure corresponding to entropy (7) is recovered.

As an example, let us consider a two-level system with probabilities $p$ and $1 - p$. The expressions for the entropy and the disequilibrium measure are as follows:
$$S_{q,\alpha} = k \left[ p^q (-\ln p)^{\alpha} + (1-p)^q \big(-\ln(1-p)\big)^{\alpha} \right], \tag{72}$$
$$D = \left(p - \frac{1}{2}\right)^2 + \left((1-p) - \frac{1}{2}\right)^2 = 2\left(p - \frac{1}{2}\right)^2. \tag{73}$$
The statistical complexity computed from these quantities is plotted in Figure 3 for the sake of analysis.

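The two-level quantities above can be evaluated directly. The sketch below assumes the two-parameter form of the entropy and the disequilibrium $D = 2(p - 1/2)^2$, and checks that the complexity vanishes at both extremes, as it should for the ordered and fully disordered states.

```python
import math

def entropy_two_level(p, q, alpha, k=1.0):
    # Assumed two-parameter entropy of a two-level system {p, 1 - p}.
    return k * sum(pi**q * (-math.log(pi))**alpha
                   for pi in (p, 1.0 - p) if pi > 0)

def complexity_two_level(p, q, alpha):
    # LMC-type complexity C = S * D with D = (p - 1/2)^2 + ((1 - p) - 1/2)^2.
    return entropy_two_level(p, q, alpha) * 2 * (p - 0.5)**2

q, alpha = 0.8, 0.9
assert complexity_two_level(0.5, q, alpha) == 0.0  # ideal-gas-like state: D = 0
assert complexity_two_level(1.0, q, alpha) == 0.0  # crystal-like state: S = 0
assert complexity_two_level(0.9, q, alpha) > 0.0   # intermediate states are complex
```

Sweeping `p` over $(0, 1)$ with such a function reproduces the qualitative single-peak shape discussed below.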

From the plots, we notice that the complexity measured using the Boltzmann-Gibbs entropy, the fractal entropy, the fractional entropy, and the two-parameter entropy goes to zero uniformly for the perfectly ordered state (crystal). The largest complexity is achieved for the two-parameter entropy, followed by the fractal entropy, the fractional entropy, and the Boltzmann-Gibbs entropy. Also, the entropic value at which the maximum complexity is reached differs among the entropies. The complexity zero corresponding to the disordered state occurs at various values of the entropy: the BG entropy reaches zero first, followed by the fractional, the fractal, and the two-parameter entropies. For the two-parameter entropy, the largest complexity is achieved more quickly by varying the fractal index $q$ rather than the fractional index $\alpha$. Also, the zero-complexity state corresponding to the disordered system is attained more quickly when the fractional index is greater than one.

This measure can be extended to a continuous probability distribution $p(x)$ obeying the normalization condition $\int p(x)\,dx = 1$. For such distributions, the summation over the states in the entropic definition is replaced by an integration over $x$. Similarly, the disequilibrium measure becomes $D = \int (p(x) - 1/N)^2\,dx$; since there is a continuum of states, the number of states $N$ is very large, and so the disequilibrium measure reduces to $D = \int p(x)^2\,dx$. The two-parameter generalization of the LMC complexity measure for the continuum case is
$$C_{q,\alpha} = \left(k \int p(x)^q \big(-\ln p(x)\big)^{\alpha}\, dx\right) \left(\int p(x)^2\, dx\right). \tag{74}$$
For the purpose of illustration, we consider the exponential distribution and calculate the corresponding two-parameter entropy and its disequilibrium measure (75). The LMC complexity found by substituting (75) in (74) reads as (76), in which an incomplete gamma function appears whose lower limit is replaced by a positive number. In the limit $q, \alpha \to 1$, the statistical complexity reduces to that of the standard LMC measure.
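For the exponential distribution $p(x) = \lambda e^{-\lambda x}$ (the rate $\lambda$ here is an illustrative parameter), the continuum disequilibrium $D = \int p(x)^2\,dx$ evaluates analytically to $\lambda/2$; the sketch below confirms this by simple quadrature.

```python
import math

def disequilibrium_exponential(lam, xmax=50.0, n=20000):
    # D = integral of p(x)^2 dx for p(x) = lam * exp(-lam * x), x >= 0,
    # computed with the trapezoidal rule; analytically D = lam / 2.
    h = xmax / n
    vals = [(lam * math.exp(-lam * i * h))**2 for i in range(n + 1)]
    return h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

lam = 1.5
assert abs(disequilibrium_exponential(lam) - lam / 2) < 1e-3
```

The truncation at `xmax` is harmless because the squared density decays exponentially.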

In order to scale the complexity to lie in the interval $[0, 1]$, a modified definition was proposed in [30]. The statistical measure of complexity featuring the two-parameter entropy (6) can also be expressed in a similar manner:
$$C_{q,\alpha} = \tilde{S}_{q,\alpha} \cdot \tilde{D}, \tag{77}$$
where $\tilde{S}_{q,\alpha}$ is the normalized entropy and $\tilde{D}$ is the normalized disequilibrium measure based on a particular distance measure. The various choices of the disequilibrium based on the different distance measures are given below.

##### 5.1. Disequilibrium Measure Based on Euclidean Distance

The disequilibrium is the distance between the probability distribution and the equiprobable distribution. For the Euclidean norm, the corresponding normalized disequilibrium measure reads as follows:
$$\tilde{D}_E = c_E \sum_{i=1}^{N} \left(p_i - \frac{1}{N}\right)^2, \tag{78}$$
where $c_E$ is the corresponding normalization constant and $N$ is the number of accessible states. For a two-level system with probabilities $p$ and $1 - p$, the statistical measure of complexity based on the Euclidean distance is given in (79). The Euclidean distance used above does not take into account the stochastic nature of the probabilities. To overcome this, the statistical distance measure introduced by Wootters is used to redefine the disequilibrium.

##### 5.2. Disequilibrium Measure Based on Wootters’ Distance

A novel distance measure called the statistical distance was proposed by Wootters [31], in which the distance between two distributions is determined by the size of statistical fluctuations in a measurement. Given that two probabilities are indistinguishable if the difference between them is smaller than the size of the typical fluctuation, the statistical distance is the shortest curve connecting the two points in probability space. This shortest distance is equal to the angle between the two probability vectors and is given by
$$d_W(P, R) = \cos^{-1}\left(\sum_{i=1}^{N} \sqrt{p_i\, r_i}\right). \tag{80}$$
This measure vanishes when the two probability distributions coincide with each other and attains its maximum value when each outcome has a positive probability according to one distribution and zero probability according to the other. In [30], the disequilibrium measure was defined using the Wootters distance for the extensive Boltzmann-Gibbs entropy, and in our work we adopt this procedure for our entropy (6). The disequilibrium defined using a normalized version of the statistical distance (80) is given in (81), where the number of accessible states and a corresponding normalization constant appear. The statistical measure of complexity of a two-level system with probabilities $p$ and $1 - p$ computed using this disequilibrium distance is given in (82). For a two-level system, the LMC complexity measure based on the two different definitions of distance, (79) and (82), is compared for certain values of $q$ and $\alpha$ in Figure 4. The complexity peaks at different values of $p$ for the different disequilibrium distance measures. The univocal dependence of the complexity on the entropy is lost when the number of levels is increased beyond two. This is due to the fact that there are many possible distributions which can give the same value of the entropy but different values of the disequilibrium distance and hence different values for the statistical complexity.
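Wootters' statistical distance is the angle $\cos^{-1}\big(\sum_i \sqrt{p_i r_i}\big)$ between the probability vectors; the sketch below checks the two defining properties mentioned above.

```python
import math

def wootters_distance(p, r):
    # Statistical distance d(P, R) = arccos( sum_i sqrt(p_i * r_i) ),
    # the angle between the unit vectors (sqrt(p_i)) and (sqrt(r_i)).
    overlap = sum(math.sqrt(pi * ri) for pi, ri in zip(p, r))
    return math.acos(min(1.0, overlap))  # clamp against rounding just above 1

# Vanishes for identical distributions ...
assert wootters_distance([0.3, 0.7], [0.3, 0.7]) < 1e-7
# ... and is maximal (pi / 2) for distributions with disjoint support.
assert abs(wootters_distance([1.0, 0.0], [0.0, 1.0]) - math.pi / 2) < 1e-12
```

The clamp matters in practice: floating-point rounding can push the overlap marginally above 1 for identical distributions, which would make `acos` raise a domain error.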
In Figure 5, we present a set of complexity versus entropy plots of a five-level system for certain typical values of the deformation parameters. From the plots, we notice that when the disequilibrium measure is based on the Euclidean distance, the upper bound to the complexity is continuous and forms an envelope over the remaining values; in the case of the Wootters-distance-based complexity, the lower bound is continuous. At the specific value of the entropy where the complexity peaks, we define the difference between the maximum and minimum values of complexity as $\Delta C$ and observe its variations for different values of the deformation parameters; $\Delta C$ attains its maximum or minimum depending on whether the parameters lie above or below unity. These differences are primarily due to the manner in which the distance between the distribution of the system and the equiprobable state is measured. Unlike the Euclidean distance, which directly measures the distance between the two points, the Wootters distance uses the idea of statistical distance, in which the distance between two states is proportional to the number of distinguishable intermediate points [31]. The LMC complexity is a statistical measure, capturing the complex dynamics of simpler statistical systems which are not in equilibrium. The measure is a product of the information content and the disequilibrium distance. In (79), where we use the Euclidean disequilibrium, the statistical details of the system appear only in the information factor, since the Euclidean distance ignores the stochastic nature of the probabilities. But when we use the definition of complexity (82), both factors incorporate the statistical effects, and hence we consider this definition of complexity to be more comprehensive and the results so obtained more appropriate.

**Figure 5:** panels (a) and (b).

#### 6. Conclusions

A new two-parameter entropy, based on the natural logarithm and generalizing both the fractal entropy and the fractional entropy, is introduced. It encompasses an interesting limiting case in which the $N$-particle entropy can be expressed as a sum of single-particle biased Boltzmann entropies. The generalized forms of the Shannon-Khinchin axioms are proposed and verified for this new two-parameter entropy, and these axioms uniquely characterize it. The corresponding Kullback-Leibler relative entropy is proposed and its properties are investigated. Utilizing the relative entropy, a generalization of the Jensen-Shannon divergence is also obtained. Exploiting the relative entropy between a probability measure and its shift, we derive the generalized Fisher information; we also obtain generalized forms of the relative Fisher information and the Jensen-Fisher information. The Lesche stability and the thermodynamic stability are verified for our entropy. We introduce a generalization of the LMC complexity measure using our two-parameter entropy and apply it to measure the complexity of a two-level system. The results indicate that the complexity changes with the deformation parameters, with the dominant contribution coming from the fractal index. As an example of a continuous probability distribution, we consider the exponential distribution and study its complexity. Using normalized forms of the entropy and the distance measure, we analyze the complexity of a given system. Two different distance measures, the normalized Euclidean distance and the normalized Wootters' distance, are used to define the complexity, and with these definitions the complexity of two-level and five-level systems is thoroughly analyzed. We conclude that Wootters' distance is the more appropriate disequilibrium measure since it captures the stochastic effects of the probability distribution.

From our investigations we expect that this entropy will be useful for measuring complexity in fractal systems and in systems exhibiting fractional dynamics in phase space. Toward this end, investigating the complexity of probability distributions corresponding to the fractional diffusion equation [32] will be worth pursuing. Extending the idea of using the Jensen-Shannon divergence as a disequilibrium measure [33] to compute statistical complexities would also be an interesting pursuit, but it lies outside the purview of the current work.

#### Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

#### Acknowledgments

One of the authors, Ravikumar C., would like to acknowledge the use of the library facilities at the Institute of Mathematical Sciences (IMSc) and to thank Mr. Chang-Hau Kuo, National Tsing Hua University, for helpful discussions and advice on numerical calculations. Chandrashekar R. was financially supported through MOST Grants 102-2811-M-005-025 and 102-2628-M-005-001-MY4 in Taiwan.