International Scholarly Research Notices

Volume 2014, Article ID 820375, 9 pages

http://dx.doi.org/10.1155/2014/820375

## Bounds on Nonsymmetric Divergence Measure in terms of Other Symmetric and Nonsymmetric Divergence Measures

^{1}Department of Mathematics, Malaviya National Institute of Technology, Jaipur, Rajasthan 302017, India

^{2}B-1, Staff Colony, MNIT, Jaipur, Rajasthan 302017, India

Received 5 June 2014; Accepted 5 September 2014; Published 29 October 2014

Academic Editor: Angelo De Santis

Copyright © 2014 K. C. Jain and Praphull Chhabra. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

Vajda (1972) studied a generalized divergence measure of Csiszar's class, the so-called "Chi-$m$ divergence measure." The variational distance and the chi-square divergence are special cases of this generalized divergence measure, at $m = 1$ and $m = 2$, respectively. In this work, a nonparametric nonsymmetric measure of divergence, a particular case of Vajda's generalized divergence, is taken and characterized. Its bounds are studied in terms of some well-known symmetric and nonsymmetric divergence measures of Csiszar's class by using well-known information inequalities. A comparison of this divergence with others is carried out. Numerical illustrations (verifications) regarding the bounds of this divergence are presented as well.

#### 1. Introduction

Let $\Gamma_n = \{P = (p_1, p_2, \ldots, p_n) : p_i \ge 0, \sum_{i=1}^{n} p_i = 1\}$, $n \ge 2$, be the set of all complete finite discrete probability distributions. If we take $p_i = 0$ for some $i$, then we have to suppose that $0 \, f(0/0) = 0$.

Csiszar [1] introduced a generalized $f$-divergence measure, which is given by
$$C_f(P, Q) = \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i}{q_i}\right),$$
where $f : (0, \infty) \to \mathbb{R}$ ($\mathbb{R}$ is the set of real numbers) is a real, continuous, and convex function and $P = (p_1, p_2, \ldots, p_n)$, $Q = (q_1, q_2, \ldots, q_n) \in \Gamma_n$, where $p_i$ and $q_i$ are probability mass functions. Many known divergences can be obtained from this generalized measure by suitably defining the convex function $f$. Some of those are as follows.
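The generalized measure above can be sketched in code. The helper below and the distributions `p`, `q` are illustrative choices, not taken from the paper:

```python
def f_divergence(p, q, f):
    """Csiszar's generalized f-divergence: sum_i q_i * f(p_i / q_i),
    where f is a convex function normalized so that f(1) = 0."""
    assert abs(sum(p) - 1.0) < 1e-9 and abs(sum(q) - 1.0) < 1e-9
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Choosing f(t) = (t - 1)^2 recovers the Pearson chi-square divergence
p = [0.3, 0.5, 0.2]
q = [0.25, 0.45, 0.30]
chi_sq = f_divergence(p, q, lambda t: (t - 1) ** 2)
```

Any of the particular measures listed below arises from the same routine by swapping in the corresponding convex function.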

##### 1.1. Symmetric Divergence Measures

Symmetric measures are those that are symmetric with respect to the probability distributions $P$ and $Q$, that is, unchanged when $P$ and $Q$ are interchanged. These measures are as follows (see [2–7]):

##### 1.2. Nonsymmetric Divergence Measures

Nonsymmetric measures are those that are not symmetric with respect to the probability distributions $P$ and $Q$. These measures are as follows (see [8–10]). Several relations hold among these and related measures, involving the harmonic mean divergence, the geometric mean divergence, and the relative JS divergence [5]. Equations (3), (4), and (8) are also known as Kolmogorov's measure, the information radius, and the directed divergence, respectively.
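As one concrete member of this nonsymmetric class, the directed divergence of Kullback and Leibler [9] can be computed directly; the distributions here are illustrative, not the paper's:

```python
import math

def kl_divergence(p, q):
    """Directed divergence of Kullback and Leibler [9]:
    D(P||Q) = sum_i p_i * ln(p_i / q_i), with the convention 0 * ln 0 = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.3, 0.5, 0.2]
q = [0.25, 0.45, 0.30]
d_pq = kl_divergence(p, q)
d_qp = kl_divergence(q, p)
# Nonsymmetry: the two directed divergences differ in general
```

Both directed values are nonnegative, but they do not coincide, which is exactly what distinguishes this class from the symmetric measures of Section 1.1.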

#### 2. Nonsymmetric Divergence Measure and Its Properties

In this section, we obtain a nonsymmetric divergence measure from a convex function and then establish the properties of the function and of the divergence. First, we state Theorem 1, which is well known in the literature [1].

Theorem 1. *If the function $f$ is convex and normalized, that is, $f(1) = 0$, then $C_f(P, Q)$ and its adjoint $C_f(Q, P)$ are both nonnegative and convex in the pair of probability distributions $(P, Q) \in \Gamma_n \times \Gamma_n$.*

Now, let $f$ be the function defined by (10). Properties of this function are as follows.

(a) Since $f(1) = 0$, $f$ is a normalized function.

(b) Since $f''(t) > 0$ for all $t > 0$, $f$ is a convex function as well.

(c) Since $f'(t) < 0$ for $t \in (0, 1)$ and $f'(t) > 0$ for $t \in (1, \infty)$, $f$ is monotonically decreasing in $(0, 1)$ and monotonically increasing in $(1, \infty)$.

Now put $f$ from (10) in (1); we get the new divergence measure in (11). This measure is called the adjoint (obtained after putting the conjugate convex function of $f$ in (1)), and it is a particular case of the "Chi-$m$ divergence measure" [11], which is given by
$$\chi^m(P, Q) = \sum_{i=1}^{n} \frac{|p_i - q_i|^m}{q_i^{m-1}}, \quad m \in [1, \infty).$$

Properties of the divergence measure defined in (11) are as follows.

(a) In view of Theorem 1, it is convex and nonnegative in the pair of probability distributions $(P, Q) \in \Gamma_n \times \Gamma_n$.

(b) It equals zero if $P = Q$ (attaining its minimum value).

(c) Since its value changes when $P$ and $Q$ are interchanged, it is a nonsymmetric divergence measure.
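The specific convex function used in (10) and (11) is not reproduced above, so the sketch below implements only the parent family, Vajda's Chi-$m$ divergence, under the assumption that it has the standard form $\sum_i |p_i - q_i|^m / q_i^{m-1}$; the distributions are illustrative:

```python
def vajda_chi_m(p, q, m):
    """Assumed standard form of Vajda's |chi|^m divergence:
    sum_i |p_i - q_i|**m / q_i**(m - 1), for m >= 1."""
    return sum(abs(pi - qi) ** m / qi ** (m - 1) for pi, qi in zip(p, q))

p = [0.3, 0.5, 0.2]
q = [0.25, 0.45, 0.30]
v = vajda_chi_m(p, q, 1)       # m = 1: variational distance, sum_i |p_i - q_i|
chi_sq = vajda_chi_m(p, q, 2)  # m = 2: chi-square divergence, sum_i (p_i - q_i)^2 / q_i
```

For any fixed $m$ the family is nonsymmetric (except at $m = 1$), and it vanishes exactly when $P = Q$, matching properties (b) and (c) above.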

#### 3. Information Inequalities of Csiszar’s Class

In this section, we consider well-known information inequalities on $C_f(P, Q)$, stated in Theorem 2. Such inequalities are needed, for instance, to calculate the relative efficiency of two divergences. This theorem is taken from the literature [12]; it relates two generalized $f$-divergence measures.

Theorem 2. *Let $f_1, f_2$ be two convex and normalized functions, that is, $f_1(1) = f_2(1) = 0$, and suppose the following assumptions.*

(a) *$f_1$ and $f_2$ are twice differentiable on $(a, b)$, where $0 < a \le 1 \le b < \infty$.*

(b) *There exist real constants $m, M$ with $m < M$ such that*
$$m \le \frac{f_1''(t)}{f_2''(t)} \le M \quad \text{for all } t \in (a, b), \ f_2''(t) > 0.$$

*If $P, Q \in \Gamma_n$ satisfy the assumption $a \le p_i / q_i \le b$ for all $i$, then one has the following inequalities:*
$$m \, C_{f_2}(P, Q) \le C_{f_1}(P, Q) \le M \, C_{f_2}(P, Q),$$
*where $C_f(P, Q)$ is given by (1).*
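To make the structure of the theorem concrete, here is a numerical sanity check with an assumed pair of functions (not the pair used in the propositions below): $f_1(t) = (t-1)^2$ gives the chi-square divergence, $f_2(t) = t \ln t$ gives the Kullback-Leibler divergence, and $f_1''(t)/f_2''(t) = 2/(1/t) = 2t$, so on $[a, b]$ one may take $m = 2a$ and $M = 2b$:

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence: sum_i q_i * f(p_i / q_i)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

p = [0.3, 0.5, 0.2]
q = [0.25, 0.45, 0.30]

chi_sq = f_divergence(p, q, lambda t: (t - 1) ** 2)   # f1(t) = (t - 1)^2
kl = f_divergence(p, q, lambda t: t * math.log(t))    # f2(t) = t ln t

# The ratios p_i / q_i lie in [a, b]; since f1''/f2'' = 2t, take m = 2a, M = 2b
ratios = [pi / qi for pi, qi in zip(p, q)]
a, b = min(ratios), max(ratios)
lower, upper = 2 * a * kl, 2 * b * kl   # m * C_f2 <= C_f1 <= M * C_f2
```

For these distributions the chi-square value indeed falls between the two scaled Kullback-Leibler bounds, as the theorem predicts.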

#### 4. Bounds in terms of Symmetric Divergence Measures

Now in this section, we obtain bounds of the divergence measure (11) in terms of other symmetric divergence measures by using Theorem 2.

Proposition 3. *Let the divergence measures be defined as in (2) and (11), respectively. For $P, Q \in \Gamma_n$, one has the following.*

*(a) If , then (14) holds.*

*(b) If , then (15) holds.*

*Proof.* Let us consider the convex function given in (16). Since its second derivative is positive for all $t > 0$ and its value at $t = 1$ is zero, it is a convex and normalized function, respectively. Now put it in (1); we get (17).

Now, let $g(t)$ be the ratio of the second derivatives of the functions given by (10) and (16), respectively, and evaluate its derivative $g'(t)$.
If , , and .

It is clear that is decreasing in and but increasing in .

Also has a minimum and maximum value at and , respectively, because and , so
And(a)if , then
(b)if , then
Results (14) and (15) are obtained by using (11), (17), (19), (20), and (21) in (13), after interchanging and .

Proposition 4. *Let the divergence measures be defined as in (3) and (11), respectively. For $P, Q \in \Gamma_n$, one has the following.*

*(a) If , then (22) holds.*

*(b) If , then (23) holds.*

*Proof.* Let us consider the convex function given in (24). Since its second derivative is positive for all $t > 0$ and its value at $t = 1$ is zero, it is a convex and normalized function, respectively. Now put it in (1); we get (25).

Now, let $g(t)$ be the ratio of the second derivatives of the functions given by (10) and (24), respectively, and evaluate its derivative $g'(t)$.
If and .

It is clear that is decreasing in and but increasing in .

Also has a minimum and maximum value at and , respectively, because and , so
And(a)if , then
(b)if , then
Results (22) and (23) are obtained by using (11), (25), (27), (28), and (29) in (13), after interchanging and .

Proposition 5. *Let the divergence measures be defined as in (4) and (11), respectively. For $P, Q \in \Gamma_n$, one has the following.*

*(a) If , then (30) holds.*

*(b) If , then (31) holds.*

*Proof.* Let us consider the convex function given in (32). Since its second derivative is positive for all $t > 0$ and its value at $t = 1$ is zero, it is a convex and normalized function, respectively. Now put it in (1); we get (33).

Now, let $g(t)$ be the ratio of the second derivatives of the functions given by (10) and (32), respectively, and evaluate its derivative $g'(t)$.
If , , and .

It is clear that is decreasing in and but increasing in .

Also has a minimum and maximum value at and , respectively, because and , so
And(a)if , then
(b)if , then
Results (30) and (31) are obtained by using (11), (33), (35), (36), and (37) in (13), after interchanging and .

Proposition 6. *Let the divergence measures be defined as in (5) and (11), respectively. For $P, Q \in \Gamma_n$, one has inequality (38).*

*Proof.* Let us consider the convex function given in (39). Since its second derivative is positive for all $t > 0$ and its value at $t = 1$ is zero, it is a convex and normalized function, respectively. Now put it in (1); we get (40).

Now, let $g(t)$ be the ratio of the second derivatives of the functions given by (10) and (39), respectively.
It is clear that is always decreasing in , so
Result (38) is obtained by using (11), (40), and (42) in (13), after interchanging and .

Proposition 7. *Let the divergence measures be defined as in (6) and (11), respectively. For $P, Q \in \Gamma_n$, one has inequality (43).*

*Proof.* Let us consider the convex function given in (44). Since its second derivative is positive for all $t > 0$ and its value at $t = 1$ is zero, it is a convex and normalized function, respectively. Now put it in (1); we get (45).

Now, let $g(t)$ be the ratio of the second derivatives of the functions given by (10) and (44), respectively.
It is clear that is always decreasing in , so
Result (43) is obtained by using (11), (45), and (47) in (13), after interchanging and .

#### 5. Bounds in terms of Nonsymmetric Divergence Measures

Now in this section, we obtain bounds of the divergence measure (11) in terms of other nonsymmetric divergence measures by using Theorem 2.

Proposition 8. *Let the divergence measures be defined as in (7) and (11), respectively. For $P, Q \in \Gamma_n$, one has the following.*

*(a) If , then (48) holds.*

*(b) If , then (49) holds.*

*Proof.* Let us consider the convex function given in (50). Since its second derivative is positive for all $t > 0$ and its value at $t = 1$ is zero, it is a convex and normalized function, respectively. Now put it in (1); we get (51).

Now, let $g(t)$ be the ratio of the second derivatives of the functions given by (10) and (50), respectively, and evaluate its derivative $g'(t)$.
If , , and .

It is clear that is decreasing in and but increasing in .

Also has a minimum and maximum value at and , respectively, because and , so
And(a)if , then
(b)if , then
Results (48) and (49) are obtained by using (11), (51), (53), (54), and (55) in (13), after interchanging and .

Proposition 9. *Let the divergence measures be defined as in (8) and (11), respectively. For $P, Q \in \Gamma_n$, one has the following.*

*(a) If , then (56) holds.*

*(b) If , then (57) holds.*

*Proof.* Let us consider the convex function given in (58). Since its second derivative is positive for all $t > 0$ and its value at $t = 1$ is zero, it is a convex and normalized function, respectively. Now put it in (1); we get (59).

Now, let $g(t)$ be the ratio of the second derivatives of the functions given by (10) and (58), respectively, and evaluate its derivative $g'(t)$.
If and .

It is clear that is decreasing in and but increasing in .

Also has a minimum and maximum value at and , respectively, because and , so
And(a)if , then
(b)if , then
Results (56) and (57) are obtained by using (11), (59), (61), (62), and (63) in (13), after interchanging and .

Proposition 10. *Let the divergence measures be defined as in (9) and (11), respectively. For $P, Q \in \Gamma_n$, one has the following.*

*(a) If , then (64) holds.*

*(b) If , then (65) holds.*

*Proof.* Let us consider the convex function given in (66). Since its second derivative is positive for all $t > 0$ and its value at $t = 1$ is zero, it is a convex and normalized function, respectively. Now put it in (1); we get (67).

Now, let $g(t)$ be the ratio of the second derivatives of the functions given by (10) and (66), respectively, and evaluate its derivative $g'(t)$.
If and .

It is clear that is decreasing in and increasing in .

Also has a minimum value at , because , so
And(a)if , then
(b)if , then
Results (64) and (65) are obtained by using (11), (67), (69), (70), and (71) in (13), after interchanging and .

#### 6. Numerical Illustration

In this section, we give two examples for calculating the divergences and verifying inequalities (14), (22), (48), and (56), that is, verifying the bounds of the divergence measure (11).

*Example 1.* Let $P$ be the binomial probability distribution with parameters $(n, p)$ and let $Q$ be its approximating Poisson probability distribution with parameter $\lambda = np$, for the random variable $X$; then we have Table 1.

By using Table 1, we get the numerical values collected in (72).

Put the approximated numerical values from (72) in (14), (22), (48), and (56); inequalities (14), (22), (48), and (56) are thereby verified for Example 1.

*Example 2.* Let $P$ be the binomial probability distribution with parameters $(n, p)$ and let $Q$ be its approximating Poisson probability distribution with parameter $\lambda = np$, for the random variable $X$; then we have Table 2.

By using Table 2, we get the numerical values collected in (73).

Put the approximated numerical values from (73) in (14), (22), (48), and (56); inequalities (14), (22), (48), and (56) are thereby verified for Example 2.

Similarly, we can verify inequalities (30), (38), (43), and (64), that is, the remaining bounds of the divergence measure (11).
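The setup of the two examples (a binomial distribution against its Poisson approximation) can be reproduced in outline; the parameters `n` and `p_success` below are hypothetical stand-ins, since the paper's actual parameter values and tables are not reproduced here:

```python
import math

def binomial_pmf(n, p, k):
    """P(X = k) for X ~ Binomial(n, p)."""
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(lam, k):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Hypothetical parameters for illustration only
n, p_success = 8, 0.1
lam = n * p_success                        # Poisson approximation parameter
ks = range(n + 1)
P = [binomial_pmf(n, p_success, k) for k in ks]
Q_raw = [poisson_pmf(lam, k) for k in ks]
Q = [qk / sum(Q_raw) for qk in Q_raw]      # renormalize over the truncated support

# Directed divergence between the two distributions
kl = sum(pk * math.log(pk / qk) for pk, qk in zip(P, Q) if pk > 0)
```

With both vectors normalized over the same finite support, any of the divergences above can be evaluated on `P` and `Q` and substituted into the inequalities to check the bounds numerically.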

Figure 1 shows the behavior of the convex function defined in (10); the function is decreasing in $(0, 1)$ and increasing in $(1, \infty)$. Figure 2 shows the behavior of the divergence (11) together with the other divergence measures considered above. It is clear from Figure 2 that the divergence (11) has a steeper slope than the other measures.

#### 7. Conclusions

Many research papers by Taneja, Kumar, Dragomir, Jain, and others have studied divergence measures, giving their properties, their bounds, and their relations with other measures. Taneja in particular has done a great deal of work in this field: in [13] he derived bounds on different nonsymmetric divergences in terms of different symmetric divergences; in [14] he introduced new generalized divergences, together with new divergences arising as differences of means, and characterized their properties and bounds; and in [15] he introduced new inequalities among nonnegative differences arising from seven means, along with correlations with generalized triangular discrimination and some new generating measures with their exponential representations.

Divergence measures have been demonstrated to be very useful in a variety of disciplines, such as anthropology, genetics, finance, economics and political science, biology, the analysis of contingency tables, the approximation of probability distributions, signal processing, pattern recognition, sensor networks, testing the order of a Markov chain, risk for binary experiments, region segmentation, and estimation.

This paper likewise defines the properties and bounds of Vajda's divergence and derives new relations with other well-known symmetric and nonsymmetric divergence measures.

#### Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

#### References

1. I. Csiszar, "Information type measures of differences of probability distribution and indirect observations," *Studia Scientiarum Mathematicarum Hungarica*, vol. 2, pp. 299–318, 1967.
2. D. Dacunha-Castelle, *Ecole d'Ete de Probabilites de Saint-Flour VII 1977*, Springer, Berlin, Germany, 1978.
3. E. Hellinger, "Neue Begründung der Theorie der quadratischen Formen von unendlichen vielen Veränderlichen," *Journal für die reine und angewandte Mathematik*, vol. 136, pp. 210–271, 1909.
4. J. Burbea and C. R. Rao, "On the convexity of some divergence measures based on entropy functions," *IEEE Transactions on Information Theory*, vol. 28, no. 3, pp. 489–495, 1982.
5. R. Sibson, "Information radius," *Zeitschrift für Wahrscheinlichkeitstheorie und Verwandte Gebiete*, vol. 14, no. 2, pp. 149–160, 1969.
6. K. C. Jain and R. Mathur, "A symmetric divergence measure and its bounds," *Tamkang Journal of Mathematics*, vol. 42, no. 4, pp. 493–503, 2011.
7. K. C. Jain and A. Srivastava, "On symmetric information divergence measures of Csiszar's f-divergence class," *Journal of Applied Mathematics, Statistics and Informatics*, vol. 3, no. 1, 2007.
8. S. S. Dragomir, V. Gluscevic, and C. E. M. Pearce, "Approximation for the Csiszar f-divergence via midpoint inequalities," in *Inequality Theory and Applications*, Y. J. Cho, J. K. Kim, and S. S. Dragomir, Eds., vol. 1, pp. 139–154, Nova Science Publishers, Huntington, NY, USA, 2001.
9. S. Kullback and R. A. Leibler, "On information and sufficiency," *Annals of Mathematical Statistics*, vol. 22, pp. 79–86, 1951.
10. I. J. Taneja, "New developments in generalized information measures," *Advances in Imaging and Electron Physics*, vol. 91, pp. 37–135, 1995.
11. I. Vajda, "On the f-divergence and singularity of probability measures," *Periodica Mathematica Hungarica*, vol. 2, pp. 223–234, 1972.
12. I. J. Taneja, "Generalized symmetric divergence measures and inequalities," *RGMIA Research Report Collection*, vol. 7, no. 4, article 9, 2004.
13. I. J. Taneja, "Bounds on non-symmetric divergence measures in terms of symmetric divergence measures," *Journal of Combinatorics, Information & System Sciences*, vol. 29, no. 1–4, pp. 115–134, 2005.
14. I. J. Taneja, "On symmetric and non-symmetric divergence measures and their generalizations," *Advances in Imaging and Electron Physics*, vol. 138, pp. 177–250, 2005.
15. I. J. Taneja, "Seven means, generalized triangular discrimination, and generating divergence measures," *Information*, vol. 4, no. 2, pp. 198–239, 2013.
