Mathematical Problems in Engineering

Volume 2018, Article ID 1791954, 16 pages

https://doi.org/10.1155/2018/1791954

## The Unifying Frameworks of Information Measures

Correspondence should be addressed to Ting-Zhu Huang; tingzhuhuang@126.com

Received 23 July 2017; Accepted 6 February 2018; Published 8 March 2018

Academic Editor: Zhen-Lai Han

Copyright © 2018 Shiwei Yu and Ting-Zhu Huang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

Information measures provide fundamental methodologies for analyzing uncertainty and unveiling the substantive characteristics of random variables. In this paper, we address different types of entropies through $\ln_q$-generalized Kolmogorov-Nagumo averages, which lead to the propositions of the survival Rényi entropy and the survival Tsallis entropy. We thereby make an inventory of eight types of entropies and classify them into two categories: density entropies, defined on density functions, and survival entropies, defined on survival functions. This study demonstrates that, for each type of density entropy, there exists a corresponding survival entropy. Furthermore, similarity measures and normalized similarity measures are proposed for each type of entropy. Generally, the functionals of different types of information-theoretic metrics are quite diverse, yet they also exhibit some unifying features in all their manifestations. We present unifying frameworks for entropies, similarity measures, and normalized similarity measures, which help us treat the available information measures as a whole and move from one functional to another in harmony with various applications.

#### 1. Introduction

Measures of probabilistic uncertainty and information have attracted growing attention since Hartley introduced the practical measure of information as the logarithm of the amount of uncertainty associated with finite possible symbol sequences, where the distribution of events is considered to be equally probable [1]. Today, entropy plays a basic role in the definitions of information measures with various applications in different areas. It has been recognized as a fundamentally important field intersecting with mathematics, communication, physics, computer science, economics, and so forth [2–5].

The generalized information theory arising from the study of complex systems was intended to expand classical information theory based on probability. The additive probability measures, which are inherent in classical information theory, are extended to various types of nonadditive measures and thus result in different types of functionals that generalize Shannon entropy [6–8]. Generally, the formalization of uncertainty functions involves a considerable diversity. However, it also exhibits some unifying features [9].

##### 1.1. Entropies Defined on Density Functions

We consider $(X, Y)$ as continuous random variables (r.v.) over a state space $\mathcal{X} \times \mathcal{Y}$ with the joint density function $f(x, y)$ and marginal density functions $f(x)$ and $f(y)$. We also consider the conditional density function $f(x \mid y)$ of $X$ given $Y$ defined over $\mathcal{X} \times \mathcal{Y}$. Note that $f(x, y)$, $f(x)$, and $f(y)$ are also used to mean $f_{X,Y}(x, y)$, $f_X(x)$, and $f_Y(y)$, respectively, if their meanings are clear in context.

Let $f(x)$ be a density function of an r.v. $X$ with $x \in \mathcal{X}$. The Khinchin axioms [10] determine the Shannon entropy in a unique way. However, this may be too narrow if one wants to describe complex systems. Therefore, a generalized measure of an r.v. $X$ with respect to Kolmogorov-Nagumo (KN) averages [11] can be deduced as
$$H_{\phi}(X) = \phi^{-1}\left(\int_{\mathcal{X}} f(x)\,\phi\!\left(\log\frac{1}{f(x)}\right)dx\right), \tag{1}$$
where $\phi$ is a continuous and strictly monotonic KN function [12] and hence has an inverse $\phi^{-1}$.
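As a discrete-case sketch of the KN average just described (the function names here are ours, not the paper's): a KN function is applied to each information value $\log(1/p_i)$, the results are averaged under the distribution, and the inverse function is applied. A linear KN function recovers the ordinary Shannon average, while an exponential one recovers a Rényi-type quantity.

```python
import math

def kn_entropy(p, phi, phi_inv):
    """Kolmogorov-Nagumo average of the information values log(1/p_i)
    for a discrete probability vector p (illustrative helper)."""
    avg = sum(pi * phi(math.log(1.0 / pi)) for pi in p if pi > 0)
    return phi_inv(avg)

p = [0.5, 0.25, 0.25]

# A linear KN function recovers the Shannon entropy (in nats).
shannon = kn_entropy(p, lambda x: x, lambda x: x)
assert abs(shannon - (-sum(pi * math.log(pi) for pi in p))) < 1e-12

# An exponential KN function phi(x) = exp((1 - alpha) x) recovers
# the Renyi entropy of order alpha.
alpha = 2.0
renyi = kn_entropy(p, lambda x: math.exp((1 - alpha) * x),
                   lambda y: math.log(y) / (1 - alpha))
assert abs(renyi - math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)) < 1e-12
```

The two assertions confirm that the KN construction reduces to familiar entropies for the two classical choices of KN function.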

The KN averages can be extended in different manners to propose more generalized information measures. We use the $q$-logarithm function [13] given as
$$\ln_q x = \frac{x^{1-q} - 1}{1 - q}, \quad q \neq 1, \tag{2}$$
to replace the logarithm function in (1). Note that $\lim_{q \to 1} \ln_q x = \ln x$ and $\ln_q$ satisfies pseudoadditivity; for example, $\ln_q(xy) = \ln_q x + \ln_q y + (1 - q)\ln_q x \ln_q y$. Hence we extend the KN averages to a generalized measure of information with respect to $\ln_q$-generalized KN averages defined by
$$H_{\phi,q}(X) = \phi^{-1}\left(\int_{\mathcal{X}} f(x)\,\phi\!\left(\ln_q\frac{1}{f(x)}\right)dx\right). \tag{3}$$
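A short numeric sketch of the $q$-logarithm and its pseudoadditivity (function name `ln_q` is ours):

```python
import math

def ln_q(x, q):
    """q-logarithm: (x^(1-q) - 1) / (1 - q); reduces to ln(x) as q -> 1."""
    if q == 1:
        return math.log(x)
    return (x ** (1 - q) - 1) / (1 - q)

x, y, q = 2.0, 3.0, 0.7

# Pseudoadditivity: ln_q(xy) = ln_q(x) + ln_q(y) + (1-q) ln_q(x) ln_q(y)
lhs = ln_q(x * y, q)
rhs = ln_q(x, q) + ln_q(y, q) + (1 - q) * ln_q(x, q) * ln_q(y, q)
assert abs(lhs - rhs) < 1e-12

# ln_q approaches the natural logarithm as q -> 1
assert abs(ln_q(x, 1 + 1e-8) - math.log(x)) < 1e-6
```

The pseudoadditive cross term $(1-q)\ln_q x \ln_q y$ vanishes at $q = 1$, which is how ordinary additivity is recovered in the Shannon limit.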

In terms of Rényi’s generalization of the axioms of KN averages [14], if $\phi(x) = x$ and $q = 1$ in (3), it yields the Shannon entropy (SE) [15] defined as
$$H(X) = \int_{\mathcal{X}} f(x)\log\frac{1}{f(x)}\,dx. \tag{4}$$

Based on the Shannon entropy, the Shannon mutual information (SMI) [15, 16] of r.v.s $X$ and $Y$ was given by
$$I(X; Y) = H(X) + H(Y) - H(X, Y) = H(X) - H(X \mid Y), \tag{5}$$
where $H(X, Y)$ is the joint Shannon entropy of $(X, Y)$ and $H(X \mid Y)$ is the conditional Shannon entropy of $X$ given $Y$.
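The decomposition of mutual information into entropies can be checked on a small discrete example (the joint table below is hypothetical, chosen only for illustration):

```python
import math

def H(p):
    """Shannon entropy (in nats) of a probability vector."""
    return -sum(v * math.log(v) for v in p if v > 0)

# A hypothetical 2x2 joint distribution of (X, Y).
joint = [[0.30, 0.20],
         [0.10, 0.40]]
px = [sum(row) for row in joint]            # marginal of X
py = [sum(col) for col in zip(*joint)]      # marginal of Y
hxy = H([v for row in joint for v in row])  # joint entropy H(X, Y)

# I(X; Y) = H(X) + H(Y) - H(X, Y), always nonnegative
mi = H(px) + H(py) - hxy
assert mi >= 0
```

For this table the marginals are dependent, so the mutual information comes out strictly positive; it would be exactly zero for a product distribution.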

If $q = 1$ and $\phi$ is chosen as $\phi(x) = e^{(1-\alpha)x}$ in (3), it yields the Rényi entropy (RE) [14] defined by
$$R_{\alpha}(X) = \frac{1}{1 - \alpha}\log\int_{\mathcal{X}} f^{\alpha}(x)\,dx, \tag{6}$$
where $\alpha > 0$ and $\alpha \neq 1$.

The Shannon entropy and Rényi entropy are additive. If $\phi(x) = x$ and $q = \alpha$ in (3), we get the pseudoadditive entropy or Tsallis entropy (TE) [17] defined by
$$T_{\alpha}(X) = \frac{1}{1 - \alpha}\left(\int_{\mathcal{X}} f^{\alpha}(x)\,dx - 1\right), \tag{7}$$
where $\alpha > 0$ and $\alpha \neq 1$.

We obtain $\lim_{\alpha \to 1} R_{\alpha}(X) = H(X)$ and $\lim_{\alpha \to 1} T_{\alpha}(X) = H(X)$. Therefore, the Rényi entropy and Tsallis entropy can be viewed as interpolation formulas between the Shannon entropy and the Hartley entropy ($\alpha = 0$). A relation between the Rényi and Tsallis entropies can be easily deduced as
$$R_{\alpha}(X) = \frac{1}{1 - \alpha}\log\bigl[1 + (1 - \alpha)T_{\alpha}(X)\bigr]. \tag{8}$$

More recently, interest in generalized information measures has increased dramatically in different manners. A respectable number of nonclassical entropies, beyond the Shannon entropy, Rényi entropy, and Tsallis entropy, have been developed in the study of complex systems.

The exponential entropy (EE) of order $\alpha$ [18] was defined by
$$E_{\alpha}(X) = \left(\int_{\mathcal{X}} f^{\alpha}(x)\,dx\right)^{1/(1-\alpha)}, \tag{9}$$
where $\alpha > 0$ and $\alpha \neq 1$.

We obtain $E_{\alpha}(X) = e^{R_{\alpha}(X)}$ and $\lim_{\alpha \to 1} E_{\alpha}(X) = e^{H(X)}$.
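A discrete-case sketch (function name ours, assuming the standard order-$\alpha$ exponential entropy form $\bigl(\sum p_i^{\alpha}\bigr)^{1/(1-\alpha)}$) checking that the exponential entropy is the exponential of the Rényi entropy:

```python
import math

def exp_entropy(p, a):
    """Exponential entropy of order a (discrete analogue, assumed form)."""
    return sum(v ** a for v in p) ** (1 / (1 - a))

p = [0.6, 0.3, 0.1]
a = 2.0

# E_a = exp(R_a) exactly, since both depend only on sum(p^a).
renyi = math.log(sum(v ** a for v in p)) / (1 - a)
assert abs(exp_entropy(p, a) - math.exp(renyi)) < 1e-12

# As a -> 1, the exponential entropy tends to exp(H(X)).
shannon = -sum(v * math.log(v) for v in p)
assert abs(exp_entropy(p, 1 + 1e-6) - math.exp(shannon)) < 1e-4
```

For a uniform distribution on $n$ points the exponential entropy equals $n$ for every order, which is why it is often read as an "effective number of states."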

##### 1.2. Entropies Defined on Survival Functions

As narrated in [19], information measures defined on the density function suffer from several drawbacks, since the distribution function is more regular than the density function. Therefore, the cumulative residual entropy, which was defined on the cumulative distribution function or equivalently the survival function, was proposed as an alternative information measure of uncertainty.

Let $X$ be a nonnegative r.v. in $\mathbb{R}_{+}^{N}$. We use the notation $x \geq y$ to mean that $x_i \geq y_i$ for $i = 1, \ldots, N$. The multivariate survival function of a nonnegative r.v. $X$ is given as
$$\bar{F}(x) = P(X \geq x) = P(X_1 \geq x_1, \ldots, X_N \geq x_N), \tag{10}$$
where $X = (X_1, \ldots, X_N)$ with $x = (x_1, \ldots, x_N)$.

If the density function is replaced by the survival function, $q$ is set as 1, and $\phi(x) = x$ in (3), it yields the survival Shannon entropy (SSE) [19] defined as
$$\mathcal{E}(X) = \int_{\mathbb{R}_{+}^{N}} \bar{F}(x)\log\frac{1}{\bar{F}(x)}\,dx. \tag{11}$$
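A one-dimensional numeric sketch (function names ours): for an exponential r.v. with rate $\lambda$, the survival function is $e^{-\lambda x}$ and the survival Shannon entropy works out in closed form to $1/\lambda$, which a simple quadrature reproduces.

```python
import math

def sse_numeric(surv, upper, n=200000):
    """Midpoint-rule approximation of -integral S(x) log S(x) dx on [0, upper]."""
    h = upper / n
    total = 0.0
    for i in range(n):
        s = surv((i + 0.5) * h)
        if 0 < s < 1:          # s*log(s) -> 0 at the endpoints anyway
            total -= s * math.log(s)
    return total * h

lam = 2.0
surv = lambda x: math.exp(-lam * x)

# Closed form for Exp(lam): -∫ e^{-lam x} (-lam x) dx = lam/lam^2 = 1/lam.
approx = sse_numeric(surv, upper=40.0 / lam)
assert abs(approx - 1.0 / lam) < 1e-3
```

The truncation at `upper = 40/lam` is a numerical convenience; the neglected tail of $\bar F \log(1/\bar F)$ is on the order of $e^{-40}$.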

Since eight different types of entropies and their corresponding similarity measures will be discussed subsequently, it is worth pointing out that some notations and names of the existing information measures will be changed in harmony with the unifying frameworks throughout this paper.

To consider the conditional survival entropy, we denote $F(x \mid y)$ as the conditional distribution function of $X$ given $Y = y$ and $\bar{F}(x \mid y)$ as the respective conditional survival function.

The cross survival Shannon entropy (CSSE) of r.v.s $X$ and $Y$ was given by [19]
$$C(X, Y) = \mathcal{E}(X) - E_Y\bigl[\mathcal{E}(X \mid Y)\bigr], \tag{12}$$
where $\mathcal{E}(X \mid Y = y)$ is the conditional survival Shannon entropy of $X$ given $Y = y$ defined as [19]
$$\mathcal{E}(X \mid Y = y) = \int_{\mathbb{R}_{+}^{N}} \bar{F}(x \mid y)\log\frac{1}{\bar{F}(x \mid y)}\,dx, \tag{13}$$
and here $E_Y$ is the expectation with respect to the r.v. $Y$. The nonnegativity of CSSE was proven in [19], and thus CSSE was used as a similarity measure in image registration [20]. Generalized versions of the SSE in dynamic systems were discussed in [21, 22].

If the density function in (9) is replaced by the survival function, this yields the survival exponential entropy (SEE) [23] of an r.v. $X$ with order $\alpha$ given by
$$M_{\alpha}(X) = \left(\int_{\mathbb{R}_{+}^{N}} \bar{F}^{\alpha}(x)\,dx\right)^{1/(1-\alpha)}, \tag{14}$$
where $\alpha > 0$ and $\alpha \neq 1$.
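A matching one-dimensional sketch for the SEE (function names ours): for an exponential r.v. with rate $\lambda$, $\int \bar F^{\alpha} = \int e^{-\alpha\lambda x}\,dx = 1/(\alpha\lambda)$, so the SEE reduces to $(\alpha\lambda)^{1/(\alpha-1)}$, which quadrature confirms.

```python
import math

def see_numeric(surv, a, upper, n=200000):
    """Midpoint-rule approximation of (integral S(x)^a dx)^(1/(1-a))."""
    h = upper / n
    integral = sum(surv((i + 0.5) * h) ** a for i in range(n)) * h
    return integral ** (1 / (1 - a))

lam, a = 2.0, 3.0
surv = lambda x: math.exp(-lam * x)

# Closed form for Exp(lam): integral = 1/(a*lam), SEE = (a*lam)^(1/(a-1)).
closed = (a * lam) ** (1 / (a - 1))
assert abs(see_numeric(surv, a, upper=40.0 / lam) - closed) < 1e-3
```

Raising the survival function to a power $\alpha > 1$ damps the tail before integrating, so higher orders weight the bulk of the distribution more heavily, mirroring the behavior of the density-based exponential entropy.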

As an ongoing research program, generalized information theory offers a steadily growing inventory of distinct entropy theories. Diversity and unity are two significant features of these theories. The growing diversity of information measures makes it increasingly realistic to find an information measure suitable for a given condition. The unity allows us to view all available information measures as a whole and to move from one measure to another as needed. To that end, motivated by the research approaches on the Shannon entropy, Shannon mutual information [2], SSE [19], and SEE [23], we attempt to study information-theoretic metrics in all their manifestations. On one hand, we propose several new types of entropies and their similarity measures; on the other hand, for each type of existing entropy, except the Shannon entropy, we give definitions of similarity measures (see Tables 1 and 2). Finally, we deduce unifying frameworks for the information measures emerging from the study of complex systems based on probability.