Information and Modeling in Complexity 2013
New Inequalities between Information Measures of Network Information Content
We refine a classical logarithmic inequality using a discrete case of the Bernoulli inequality, and we then further refine two information inequalities between information measures for graphs, based on information functionals, presented by Dehmer and Mowshowitz (2010) as Theorems 4.7 and 4.8. The inequalities concern entropy-based measures of network information content and have significant implications for information processing in complex networks (a subarea of research in the modeling of complex systems).
We first present the classical Shannon entropy; see [1, 2]. Let $X$ be a discrete random variable taking values in $\{x_1, \dots, x_n\}$, and let $p(x_i) = P(X = x_i)$ be the probability mass function of $X$; then, the Shannon entropy of $X$ is defined by
$$H(X) = -\sum_{i=1}^{n} p(x_i) \log p(x_i).$$
We continue with some basic theoretical facts about graphs; see [3, 4]. We consider $G = (V, E)$ to be a finite undirected graph. $G$ is connected if for any two vertices $u, v \in V$ there exists an undirected path from $u$ to $v$. Furthermore, we will consider $\mathcal{G}_{UC}$ to be the set of finite undirected and connected graphs. Such graphs are commonly used in strategies for improving transport efficiency in networks, including designing efficient routing strategies and making appropriate adjustments to the underlying network structure.
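As an illustration, Shannon's entropy can be computed directly from a probability mass function; the following is a minimal sketch (the helper name `shannon_entropy` is ours, not from the cited references):

```python
import math

def shannon_entropy(p, base=2):
    """H(X) = -sum_i p(x_i) * log p(x_i), with 0 * log 0 taken as 0."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# A uniform distribution maximizes entropy: a fair coin carries 1 bit,
# while a biased coin carries strictly less.
print(shannon_entropy([0.5, 0.5]))
print(shannon_entropy([0.9, 0.1]))
```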
Now, taking into account the work presented in [6, 7], we define the concept of an information functional. Let $G = (V, E) \in \mathcal{G}_{UC}$, and let $S$ be an abstract set; then, $f : S \to \mathbb{R}_{+}$ is called an information functional of $G$ (we assume that $f$ is monotone). The abstract set $S$ can be, for example, the vertex set, a set of paths, certain subgraphs, and so forth. The set $S$ is used to define the functional $f$ that captures the structural information of the graph $G$ and has to be defined concretely.
Using different information functionals to measure the entropy of a graph, we obtain different probability distributions, so the resulting graph entropies also differ. Further, we define, as in [6, 7], the entropy of a graph $G$ with arbitrary vertex labels $v_1, \dots, v_{|V|}$, using an arbitrary information functional $f$, to be
$$I_f(G) := -\sum_{i=1}^{|V|} \frac{f(v_i)}{\sum_{j=1}^{|V|} f(v_j)} \log \frac{f(v_i)}{\sum_{j=1}^{|V|} f(v_j)},$$
and, considering for each vertex $v_i \in V$ the quantities $p(v_i) := f(v_i) / \sum_{j=1}^{|V|} f(v_j)$, interpreted as vertex probabilities, we conclude that $I_f(G) = -\sum_{i=1}^{|V|} p(v_i) \log p(v_i)$.
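For concreteness, the definition of $I_f(G)$ can be turned into a short computation. The sketch below is our illustration, not code from the cited works; it uses a hypothetical degree-based information functional $f(v) = \deg(v) + 1$, chosen only because it is strictly positive on every vertex:

```python
import math

def graph_entropy(vertices, edges, f):
    """I_f(G) = -sum_i p(v_i) log2 p(v_i), with p(v_i) = f(v_i) / sum_j f(v_j)."""
    adjacency = {v: set() for v in vertices}
    for u, w in edges:                     # undirected graph
        adjacency[u].add(w)
        adjacency[w].add(u)
    values = [f(v, adjacency) for v in vertices]
    total = sum(values)                    # normalizing constant
    return -sum((x / total) * math.log2(x / total) for x in values if x > 0)

# Hypothetical information functional: f(v) = deg(v) + 1.
f_deg = lambda v, adj: len(adj[v]) + 1

# Path on 3 vertices (degrees 1, 2, 1) versus the more symmetric triangle,
# whose uniform vertex probabilities give the maximal entropy log2(3).
print(graph_entropy([1, 2, 3], [(1, 2), (2, 3)], f_deg))
print(graph_entropy([1, 2, 3], [(1, 2), (2, 3), (1, 3)], f_deg))
```

Different choices of $f$ induce different vertex probabilities and hence different entropies, which is exactly the point of the definition above.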
2. Refinement of a Classical Logarithmic Inequality
It is well known from the literature that $\ln x \le x - 1$ for every $x > 0$, with equality if and only if $x = 1$. Trying to refine this inequality, we consider the classical discrete Bernoulli inequality, as follows.
Theorem 1 (Bernoulli inequality). Let $x > -1$ and $r \in \mathbb{R}$. Then, if $r \ge 1$ or $r \le 0$, one has $(1 + x)^r \ge 1 + rx$, and if $0 \le r \le 1$, then $(1 + x)^r \le 1 + rx$. Now, using a factorization of a real number, one concludes as follows.
Theorem 2. Let $x > 0$ be a real number and let $n \in \mathbb{N}$, $n \ge 2$; then
$$\ln x \le n\left(x^{1/n} - 1\right) \le x - 1,$$
where the inequalities are strict if $x > 1$, likewise strict if $0 < x < 1$, and become equalities if $x = 1$.
Proof. The inequalities become identities for $x = 1$; that is, $\ln 1 = 0 = n(1^{1/n} - 1) = 1 - 1$. We will treat the other two cases, when $x > 1$ and when $0 < x < 1$, in the same manner, as follows: we write $\ln x = n \ln x^{1/n}$, for which applying the classical logarithmic inequality for $x^{1/n}$ yields $\ln x \le n(x^{1/n} - 1)$, and making use of the second part of the Bernoulli inequality, previously presented, with exponent $r = 1/n$, namely $x^{1/n} = (1 + (x - 1))^{1/n} \le 1 + (x - 1)/n$, we obtain $n(x^{1/n} - 1) \le x - 1$, and we are done.
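The chain $\ln x \le n(x^{1/n} - 1) \le x - 1$ is easy to verify numerically; the snippet below (ours, purely illustrative) checks it on a few sample points and shows the middle bound tightening as $n$ grows:

```python
import math

def chain(x, n):
    """Return the three terms (ln x, n*(x**(1/n) - 1), x - 1) of the chain."""
    return math.log(x), n * (x ** (1.0 / n) - 1.0), x - 1.0

for x in (0.1, 0.5, 2.0, 10.0):
    lo, mid, hi = chain(x, n=4)
    assert lo <= mid <= hi          # the refined inequality holds
    print(f"x = {x}: {lo:.4f} <= {mid:.4f} <= {hi:.4f}")

# As n grows, n*(x**(1/n) - 1) decreases toward ln x, so the bound tightens.
assert chain(10.0, 16)[1] < chain(10.0, 4)[1]
```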
3. Refinement of Dehmer and Mowshowitz Inequalities between Information Measures
In the work of Dehmer and Mowshowitz (2010), Theorems 4.7 and 4.8 present two information inequalities derived assuming only the characteristic properties of the functions involved. Here, we refine those results by using the previous theorem, as follows.
Theorem 3. Let $f_1$ and $f_2$ be information functionals, let $n \in \mathbb{N}$, $n \ge 2$, and write $p_i := f_1(v_i) / \sum_{j=1}^{|V|} f_1(v_j)$ and $q_i := f_2(v_i) / \sum_{j=1}^{|V|} f_2(v_j)$. Then (all logarithms are in base 2)
$$I_{f_1}(G) \le -\sum_{i=1}^{|V|} p_i \log q_i + \frac{n}{\ln 2} \sum_{i=1}^{|V|} p_i \left(\left(\frac{q_i}{p_i}\right)^{1/n} - 1\right) \le -\sum_{i=1}^{|V|} p_i \log q_i,$$
and, interchanging the roles of $f_1$ and $f_2$,
$$I_{f_2}(G) \le -\sum_{i=1}^{|V|} q_i \log p_i + \frac{n}{\ln 2} \sum_{i=1}^{|V|} q_i \left(\left(\frac{p_i}{q_i}\right)^{1/n} - 1\right) \le -\sum_{i=1}^{|V|} q_i \log p_i,$$
with equalities if and only if $p_i = q_i$ for all $i$.
Proof. Applying Theorem 2 for $x = q_i / p_i$ yields that
$$\ln \frac{q_i}{p_i} \le n\left(\left(\frac{q_i}{p_i}\right)^{1/n} - 1\right) \le \frac{q_i}{p_i} - 1.$$
By multiplying this inequality first with $p_i$ and then, for the second chain, applying Theorem 2 for $x = p_i / q_i$ and multiplying with $q_i$, and finally summing over $i$, using $\sum_{i}(q_i - p_i) = 0$ and $\log t = \ln t / \ln 2$, we obtain the wanted results.
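A quick numerical sanity check of the first chain (our sketch, not the authors' code): we use two hypothetical vertex-probability distributions $p$ and $q$ in place of the distributions induced by concrete information functionals, and verify that the refined middle term sits between $I_{f_1}(G)$ and the cross-entropy bound.

```python
import math

def refined_bounds(p, q, n):
    """Return (I_{f1}, refined middle term, cross-entropy -sum_i p_i log2 q_i)."""
    i_f1 = -sum(pi * math.log2(pi) for pi in p)
    cross = -sum(pi * math.log2(qi) for pi, qi in zip(p, q))
    middle = cross + (n / math.log(2)) * sum(
        pi * ((qi / pi) ** (1.0 / n) - 1.0) for pi, qi in zip(p, q))
    return i_f1, middle, cross

# Hypothetical vertex probabilities induced by two information functionals.
p = [2 / 7, 3 / 7, 2 / 7]
q = [0.5, 0.25, 0.25]
lo, mid, hi = refined_bounds(p, q, n=4)
assert lo <= mid <= hi   # the middle term sharpens the classical upper bound
print(f"{lo:.4f} <= {mid:.4f} <= {hi:.4f}")
```

Note that the middle inequality is valid by Theorem 2 alone, so the check succeeds for any pair of strictly positive distributions.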
As a main result of this paper, we improved two inequalities between entropy-based measures, giving us a way of bounding measures of network properties. A good example for this purpose is found in the literature, where, using concrete information functionals, the corresponding entropies are computed for the graphs in a set of 2265 nonisomorphic chemical graphs, that is, in MS2265. Here, new stronger lower and, respectively, upper bounds for entropy-based measures are given, and thus, by knowing the limitations of such measures, we narrow down the set of expected values. The utility of inequalities between measures of structural information content lies in problem solving through problem transformation, as has also been noted in the literature.
This work was supported and funded by the project “ERRIC - Empowering Romanian Research on Intelligent Information Technologies/FP7-REGPOT-2010-1”, ID: 264207.
[1] R. Ash, Information Theory, Interscience, New York, NY, USA, 1965.
[2] T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley & Sons, 2006.
[3] F. Buckley and F. Harary, Distance in Graphs, Addison-Wesley, 1990.
[4] F. Harary, Graph Theory, Addison-Wesley, Reading, Mass, USA, 1969.