Abstract

In this paper, we present a new refinement of the integral Jensen inequality by utilizing certain functions and give its applications to various means. We utilize the refinement to obtain some new refinements of the Hermite-Hadamard and Hölder inequalities as well. Also, we present its applications in information theory. At the end of this paper, we give a more general form of the proposed refinement of the Jensen inequality, associated with several functions.

1. Introduction

Being an important part of modern applied analysis, the field of mathematical inequalities has recorded exponential growth, with significant impact on various parts of science and technology [1–5]. These inequalities have also been extended and generalized in various directions; one can see such results in [6–15]. The Jensen weighted integral inequality is a central tool among them; its basic form is as follows [16].

Theorem 1. Assume a convex function $\phi:I\to\mathbb{R}$ on an interval $I$, and let $g,p:[a,b]\to\mathbb{R}$ be measurable functions such that $g(t)\in I$ and $p(t)\geq 0$ for all $t\in[a,b]$. Also, suppose that $p$, $pg$, and $p\,(\phi\circ g)$ are all integrable functions on $[a,b]$ and $\int_a^b p(t)\,dt>0$; then
$$\phi\left(\frac{\int_a^b p(t)\,g(t)\,dt}{\int_a^b p(t)\,dt}\right)\leq\frac{\int_a^b p(t)\,\phi(g(t))\,dt}{\int_a^b p(t)\,dt}.$$

The Jensen inequality is one of the fundamental inequalities in modern applied analysis. This inequality is of pivotal importance because various other classical inequalities, for example, the Beckenbach-Dresher, Minkowski, Hermite-Hadamard, Ky Fan, Hölder, arithmetic-geometric, Levinson, and Young inequalities, can be deduced from it. Also, this inequality can be treated as a problem-solving tool in different areas of science and technology, and an extensive literature is dedicated to it regarding its counterparts, generalizations, improvements, and converse results (see, for instance, [17–21] and the references therein).
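As a simple illustration of this deductive power (a standard two-point example in generic notation, not one of the results of this paper), applying the discrete Jensen inequality with equal weights to the convex function $\phi(x)=-\ln x$ gives, for $a,b>0$,
$$-\ln\left(\frac{a+b}{2}\right)\leq\frac{-\ln a-\ln b}{2},\qquad\text{that is,}\qquad\sqrt{ab}\leq\frac{a+b}{2},$$
which is the arithmetic-geometric mean inequality.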

The Hermite-Hadamard inequality is presented as follows ([22]; see also page 10 in [23]).

Theorem 2. Assume a convex function $f:[a,b]\to\mathbb{R}$; then the following double inequality holds:
$$f\left(\frac{a+b}{2}\right)\leq\frac{1}{b-a}\int_a^b f(x)\,dx\leq\frac{f(a)+f(b)}{2}.$$

The Hölder inequality in its integral form is presented as follows [23].

Theorem 3. Let $p,q>1$ be such that $\frac{1}{p}+\frac{1}{q}=1$, and assume two measurable functions, say $f,g:[a,b]\to\mathbb{R}$, such that the functions $|f|^{p}$ and $|g|^{q}$ are integrable on $[a,b]$. Then
$$\int_a^b|f(x)\,g(x)|\,dx\leq\left(\int_a^b|f(x)|^{p}\,dx\right)^{1/p}\left(\int_a^b|g(x)|^{q}\,dx\right)^{1/q}.$$

The rest of the paper is organized in the following manner: Section 2 proposes a new refinement of the integral Jensen inequality, associated with four functions whose pairwise sums equal unity. Utilizing this refinement, we derive some new refinements of the Hölder and Hermite-Hadamard inequalities and some new inequalities for power and quasi-arithmetic means. In Section 3, we focus on deducing inequalities for the Csiszár divergence, variational distance, Shannon entropy, and Kullback-Leibler divergence. In the last section, we present a more general form of the proposed refinement, associated with several functions.

2. Main Results

Assume a real-valued convex function defined on the interval , and suppose that are some integrable functions satisfying the following conditions: with , for all and . Also, for a nonempty subinterval of , we put ; then, under the above assumptions, we can define the following functional

We give the following refinement of the integral Jensen inequality, associated with four functions.

Theorem 4. Let be a convex function defined on the interval . Also, let be some integrable functions satisfying the following conditions: with , for all and . Then, for any nonempty subinterval of with , the following inequalities hold: For a concave function , the reverse inequalities hold in (5).

Proof. Since for all , it follows that for the subinterval of with , we have Multiplying (6) by and applying the function , then by the convexity of , we have Also, by making use of the integral Jensen inequality, one has Combining (7) and (8), we get (5).

Remark 5. The following is an equivalent form of the inequality (5)

From Theorem 4, we obtain a new refinement of the Hermite-Hadamard inequality as follows.

Corollary 6. Let be a convex function defined on . Also, let be integrable functions such that with and for all . Then, for a subinterval of with , the following inequalities hold: The direction of the inequalities in (11) is reversed when the function is concave.

Proof. Utilizing Theorem 4 for for all , we get (11).

From Theorem 4, we deduce the following refinement of Hölder’s inequality.

Corollary 7. If and the functions and defined on are nonnegative such that the functions and are integrable with and for all . Also, if is a subinterval of with , then:
(A) For such that , the following inequalities hold
(B) For and with , the following inequalities hold
(C) For and with , the inequalities in (13) hold

Proof.
(A) In the case when , let . Then, by using Theorem 4 for , , , and , we obtain (12). Also, let ; then, applying the same procedure as above and replacing by , respectively, we obtain (12). For and , the inequalities in (12) also hold. This can be proved as follows: since we know that , taking the integral, with the proposed conditions we obtain , which concludes the result.
(B) In the case when , we have . So, applying (12) for , , , and , we obtain (13).
(C) In the case when , we have . So, applying the arguments of part (B) and replacing by , respectively, we get (13).

The Hölder inequality is refined by the following corollary.

Corollary 8. Let and be nonnegative functions defined on such that and are integrable functions with and for all . Also, assume that is a subinterval of with ; then:
(A) For and , the following inequalities hold
(B) For and with , the following inequalities hold
(C) For and with , the result in (16) holds

Proof.
(A) Let , which is clearly a concave function for . Thus, by using Theorem 4 for , , and , we obtain (15). If , then adopting the same procedure and replacing by , respectively, we obtain (15). For and , the inequalities in (15) also hold. This can be proved as follows: since we know that , taking the integral and using the proposed conditions, we get , which verifies the result.
(B) In the case when , applying (15) for , , , and , we obtain (16).
(C) In the case when , we have , which shows that this case reduces to case (B); therefore, applying the arguments of case (B) and replacing by , respectively, we get (16).

Let and be positive integrable functions defined on and be any nonempty subinterval of , then the integral power means of order are defined as
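For orientation, a sketch of the standard definition in generic notation (assuming a positive weight function $p$, a positive integrable function $g$ on $[a,b]$, and a nonempty subinterval $J\subseteq[a,b]$): the weighted integral power mean of order $r\in\mathbb{R}$ is commonly written as
$$M_{r}(g;p;J)=\begin{cases}\left(\dfrac{\int_{J}p(t)\,g^{r}(t)\,dt}{\int_{J}p(t)\,dt}\right)^{1/r}, & r\neq0,\\[3mm]\exp\left(\dfrac{\int_{J}p(t)\ln g(t)\,dt}{\int_{J}p(t)\,dt}\right), & r=0.\end{cases}$$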

Corollary 9. Assume some positive integrable functions and defined on with and for . Also, assume that such that ; then:
(A) For , the following inequalities hold
(B) For , the following inequalities hold

Proof.
(A) Let for ; then the following possible cases can be discussed:
Case 1. If with , then , and the function is convex. Therefore, utilizing (5) for and and then taking the power , we obtain (20).
Case 2. If with , then , and the function is concave. Therefore, utilizing (5) for and and then taking the power , we obtain (20).
Case 3. If with , then , and the function is convex. Therefore, utilizing (5) for and and then taking the power , we obtain (20).
For the case when , taking of (20), we get (21).
(B) Let for ; then the following possible cases can be discussed:
Case 1. If with , then , and the function is convex. Hence, using (5) for and and then taking the power , we obtain (22).
Case 2. If with , then , and the function is concave. Hence, using (5) for and and then taking the power , we obtain (22).
Case 3. Similarly, if with , then , and the function is convex. Hence, using (5) for and and taking the power , we obtain (22).
For the case when , taking in (22), we get (23).

Let and be some positive integrable functions and be an arbitrary integrable function defined on . Further, if is considered as a strictly monotone and continuous function whose domain is the image of , then for any nonempty subinterval of , the quasi-arithmetic mean is defined by
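For reference, a sketch of the standard definition in generic notation (assuming a positive weight $p$, an integrable function $g$ on $[a,b]$, a nonempty subinterval $J\subseteq[a,b]$, and a strictly monotone continuous function $\psi$ defined on the image of $g$): the weighted quasi-arithmetic mean is commonly written as
$$M_{\psi}(g;p;J)=\psi^{-1}\left(\frac{\int_{J}p(t)\,\psi(g(t))\,dt}{\int_{J}p(t)\,dt}\right).$$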

Some inequalities are given for the quasi-arithmetic mean as follows.

Corollary 10. Assume some positive integrable functions defined on such that and for , and further assume that is an arbitrary integrable function defined on . Also, suppose that is a strictly monotone continuous function whose domain is the image of . Then, if is a convex function, the following inequalities hold: The direction of the inequalities in (26) is reversed when the function is concave.

Proof. The desired inequalities can be obtained by utilizing (5) for and .

3. Applications in Information Theory

In this section, we use the main result to obtain some new and interesting estimates for various divergences and the Shannon entropy in information theory. Literature on the inequalities related to these divergences can be found in [24].

Definition 11 (Csiszár divergence [25]). Assume that is a function defined on a positive interval . Also, assume that are two integrable functions such that for all ; then the integral form of the Csiszár divergence is defined by
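For concreteness, a sketch in generic notation (assuming a function $f:(0,\infty)\to\mathbb{R}$ and positive integrable functions $p,q$ on $[a,b]$ with $p(t)/q(t)$ in the domain of $f$): the integral Csiszár $f$-divergence is commonly written as
$$C_{f}(p,q)=\int_a^b q(t)\,f\!\left(\frac{p(t)}{q(t)}\right)dt.$$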

Theorem 12. Let be a convex function defined on a positive interval and assume that are integrable functions such that , , and for all . Then, for a subinterval of with , the following inequalities hold

Proof. Using Theorem 4 for , , and , we obtain (28).

Definition 13 (Shannon entropy). The Shannon entropy for a positive probability density function defined on is given by
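In generic notation (a positive probability density function $q$ on $[a,b]$, written here with the natural logarithm), the continuous Shannon entropy is commonly written as
$$S(q)=-\int_a^b q(t)\ln q(t)\,dt.$$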

Corollary 14. Let be integrable functions such that and for all . Also, let be two positive probability density functions defined on ; then, for a subinterval of with , the following inequalities hold

Proof. Using (28) for the function , , we obtain (30).

Remark 15. For , the result (30) becomes

Definition 16 (Kullback-Leibler divergence). The Kullback-Leibler divergence for two positive probability densities and defined on is given by
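In generic notation (positive probability densities $p$ and $q$ on $[a,b]$), the Kullback-Leibler divergence is commonly written as
$$D(p\,\|\,q)=\int_a^b p(t)\ln\!\left(\frac{p(t)}{q(t)}\right)dt,$$
which is the Csiszár divergence generated by the convex function $f(x)=x\ln x$.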

Corollary 17. Let be integrable functions such that and are positive probability density functions and and for all . Then, for a subinterval of with , the following inequalities hold

Proof. Using (28) for the convex function , , we obtain (33).

Definition 18 (variational distance). The variational distance for two positive probability densities and defined on is given by
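In generic notation (positive probability densities $p$ and $q$ on $[a,b]$), the variational (total variation) distance is commonly written as
$$V(p,q)=\int_a^b\bigl|p(t)-q(t)\bigr|\,dt,$$
the Csiszár divergence generated by the convex function $f(x)=|x-1|$.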

Corollary 19. Let and be interpreted as in Corollary 17, then

Proof. Using the convex function for in (28), we obtain (35).

Definition 20 (Jeffrey’s distance). Jeffrey’s distance for two positive probability density functions and defined on is given by
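In generic notation (positive probability densities $p$ and $q$ on $[a,b]$), Jeffrey's distance is commonly written as
$$J(p,q)=\int_a^b\bigl(p(t)-q(t)\bigr)\ln\!\left(\frac{p(t)}{q(t)}\right)dt,$$
the Csiszár divergence generated by the convex function $f(x)=(x-1)\ln x$.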

Corollary 21. Let and be interpreted as in Corollary 17, then

Proof. Using the convex function in (28), we obtain (37).

Definition 22 (Bhattacharyya coefficient). The Bhattacharyya coefficient for two positive probability density functions and defined on is given by
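In generic notation (positive probability densities $p$ and $q$ on $[a,b]$), the Bhattacharyya coefficient is commonly written as
$$B(p,q)=\int_a^b\sqrt{p(t)\,q(t)}\,dt;$$
up to sign, it corresponds to the Csiszár divergence generated by the convex function $f(x)=-\sqrt{x}$, since $C_{-\sqrt{x}}(p,q)=-B(p,q)$.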

Corollary 23. Let and be interpreted as in Corollary 17, then

Proof. Using the convex function , in (28), we obtain (39).

Definition 24 (Hellinger distance). The Hellinger distance for two positive probability density functions and defined on is given by
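In generic notation (positive probability densities $p$ and $q$ on $[a,b]$; conventions differ by a factor of $1/2$), the squared Hellinger distance is commonly written as
$$h^{2}(p,q)=\frac{1}{2}\int_a^b\left(\sqrt{p(t)}-\sqrt{q(t)}\right)^{2}dt,$$
the Csiszár divergence generated by the convex function $f(x)=\frac{1}{2}(\sqrt{x}-1)^{2}$.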

Corollary 25. Let and be defined as in Corollary 17, then

Proof. Using (28) for the convex function we obtain (41).

Definition 26 (triangular discrimination). The triangular discrimination for two positive probability density functions and defined on is given by
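In generic notation (positive probability densities $p$ and $q$ on $[a,b]$), the triangular discrimination is commonly written as
$$\Delta(p,q)=\int_a^b\frac{\bigl(p(t)-q(t)\bigr)^{2}}{p(t)+q(t)}\,dt,$$
the Csiszár divergence generated by the convex function $f(x)=\dfrac{(x-1)^{2}}{x+1}$.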

Corollary 27. Let and be as stated in Corollary 17, then

Proof. Utilizing the convex function , in (28), we obtain (43).

3.1. Further Generalization

In this section, we give a more general form of the proposed refinement of the integral Jensen inequality, associated with several functions.

Theorem 28. Let be a convex function defined on the interval . Also, let be integrable functions for each where for all and , for each . Suppose that are some nonempty subsets of such that for and . Furthermore, if are some nonempty subintervals of such that for and , then the following inequalities hold: If the function is concave, then the reverse inequalities hold in (44).

Proof. Since for all and each , it follows that for the subintervals of , we have Applying the integral Jensen inequality to all terms on the right-hand side of (45), we obtain , which concludes the result (44).

Remark 29. Using Theorem 28 for , we obtain Theorem 4. Also, one can present similar applications of Theorem 28 as in the previous sections.

4. Conclusion

Jensen’s inequality and its refinements are useful in various contexts; for example, they help to establish the nonnegativity of the Kullback-Leibler divergence, they can be used to obtain useful estimates for the Shannon and Zipf-Mandelbrot entropies and for various divergences in information theory, the Jensen gap can be utilized to obtain error bounds in the estimation of certain parameters, and they are useful in the stability analysis of discrete and continuous-time systems with time-varying delay. In this paper, a new refinement of the integral Jensen inequality is proposed with the help of four special types of functions. The refinement is utilized to obtain improved inequalities for various means. Also, new refinements of the Hölder and Hermite-Hadamard inequalities are obtained. New estimates for several divergences and the Shannon entropy are presented. At the end of the paper, a more general form of the proposed refinement of the Jensen inequality, associated with several special types of functions, is established.

Data Availability

No data were used to support this study.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors’ Contributions

All authors contributed equally to the writing of this paper. All authors read and approved the final manuscript.