Abstract

Jensen’s inequality and its related inequalities have attracted the attention of several mathematicians, since Jensen’s inequality has numerous applications in almost all disciplines of mathematics and in other fields of science. In this article, we propose new bounds for the difference of the two sides of Jensen’s inequality in terms of power means. An example is presented to illustrate and support the main results. Related results are given in quantum calculus. As consequences, improvements of the quantum integral version of the Hermite-Hadamard inequality are derived. The obtained inequalities are applied to some well-known inequalities such as the Hermite-Hadamard, Hölder, and power mean inequalities. Finally, some applications are given in information theory. The tools used to obtain the main results may be applied to derive further results for other inequalities.

1. Introduction

There is no doubt that one of the most important classes of functions is the class of convex functions. The beauty of convex functions is due to their unique graphical representation, geometrical interpretation, and the developments they have stimulated in the theory of inequalities. Numerous applicable inequalities have been established for this class of functions, such as Jensen’s, the Jensen-Steffensen, and majorization inequalities [1, 2]. One of the most important and widely applicable inequalities, which has attracted the attention of many mathematicians, is the Jensen inequality [2, p. 43]. According to this inequality, if is a convex function and for each with , then
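For orientation, the classical discrete form of inequality (1) can be stated in standard (generic) notation as follows: for a convex function $f$, points $x_1,\dots,x_n$ in its domain, and positive weights $w_1,\dots,w_n$ with $W_n=\sum_{i=1}^{n}w_i$,
\[
f\!\left(\frac{1}{W_n}\sum_{i=1}^{n} w_i x_i\right)\le \frac{1}{W_n}\sum_{i=1}^{n} w_i f(x_i).
\]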

The inequality in (1) is reversed when the function is concave.

The integral version of the Jensen inequality is presented in the following theorem [3].

Theorem 1. Suppose that is an interval and are functions such that for . Let the function be convex and the functions be integrable. Also, assume that for all and ; then
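In one common formulation (with generic symbols that need not coincide with those of [3]), the integral Jensen inequality reads as follows: if $p\ge 0$ is integrable on $[a,b]$ with $\int_a^b p(t)\,dt>0$ and $g$ is integrable with values in the interval on which $f$ is convex, then
\[
f\!\left(\frac{\int_a^b p(t)\,g(t)\,dt}{\int_a^b p(t)\,dt}\right)\le \frac{\int_a^b p(t)\,f\bigl(g(t)\bigr)\,dt}{\int_a^b p(t)\,dt}.
\]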

This inequality has been utilized in Economics [4], Engineering [5], Optimization [6], Finance [7], Statistics [8], and Information theory [3, 9, 10]. For a particular convex function, this inequality provides many other inequalities such as the AM-GM, Hölder, and Ky Fan inequalities.

In the literature, the discrepancy between the two sides of Jensen’s inequality has been studied by several mathematicians in different directions, since it provides error bounds for certain approximations. In 2015, Costarelli and Spigler [11] studied the discrepancy between the right and left sides of Jensen’s inequality by using convex functions from the class of functions as well as the class of merely Lipschitz continuous functions. Some illustrative examples were presented, and the bounds were compared with some existing bounds. In 2018, Pečarić et al. [12] focused on finding bounds for the difference of the two sides of Jensen’s inequality. They considered some Green convex functions and their related identities and derived bounds for the Jensen gap for a class of functions without using the convexity condition; in particular, they applied the Hölder inequality. Related discrete results were also obtained, and several applications in information theory were presented. In 2020, Khan et al. [3] introduced a new method for the derivation of bounds using higher-order convex functions, that is, -convexity. First, they obtained an identity for the Jensen difference in terms of Green convex functions and the second derivative of a function. They then obtained bounds by using -convexity and some properties of the absolute value function. Some examples were considered for their main results and compared with earlier bounds. Also, several applications to some well-known inequalities such as the Hölder and Hermite-Hadamard inequalities were given, and several applications in information theory were presented. Related discrete results are given in [13]. In 2021, Khan et al. [14] further modified the method given in [3, 13] and derived several results for Jensen’s and related inequalities. In this method, they used real weights and, with the help of the obtained identity for the Jensen gap, evaluated the integrals of certain functions involving Green functions in a very simple way. By virtue of this procedure, they were able to obtain bounds for the Jensen-Steffensen and converse Jensen inequalities. For more interesting results related to the celebrated Jensen inequality, we recommend [15–17].

The main results of this manuscript utilize power means and related inequalities; therefore, we recall them in the remaining part of this section. The following well-known power means and their monotonicity property are given in [18, p. 19]:

For two positive real -tuples and , the power mean of order is defined bywhere .
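In standard (generic) notation, for a positive $n$-tuple $\mathbf{x}=(x_1,\dots,x_n)$ and positive weights $\mathbf{w}=(w_1,\dots,w_n)$ with $W_n=\sum_{i=1}^{n}w_i$, the power mean of order $r$ is
\[
M_r(\mathbf{x};\mathbf{w})=
\begin{cases}
\left(\frac{1}{W_n}\sum_{i=1}^{n} w_i x_i^{\,r}\right)^{1/r}, & r\neq 0,\\[1ex]
\left(\prod_{i=1}^{n} x_i^{\,w_i}\right)^{1/W_n}, & r=0.
\end{cases}
\]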

For the above -tuples, the quasiarithmetic mean is defined bywhere is a strictly monotone and continuous function.
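In the same generic notation, the quasiarithmetic mean generated by a strictly monotone continuous function $h$ is
\[
M_h(\mathbf{x};\mathbf{w})=h^{-1}\!\left(\frac{1}{W_n}\sum_{i=1}^{n} w_i\,h(x_i)\right),
\]
which reduces to the power mean $M_r$ for $h(x)=x^{r}$, $r\neq 0$.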

The integral power mean can be defined as follows: If and are integrable functions, then the integral power mean of order is defined by

If , then
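This is the well-known monotonicity property of power means: in the generic notation above, for $r\le s$,
\[
M_r(\mathbf{x};\mathbf{w})\le M_s(\mathbf{x};\mathbf{w}),
\]
with the analogous relation holding for the integral power means; this monotonicity is the power mean inequality invoked in the proofs below.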

The main aim of this paper is to obtain new interesting bounds for the discrepancy of the two sides of the Jensen inequality using new tools. The bounds involve power means. We give an example which shows that the bounds obtained in this paper are better than earlier ones. We prove Jensen’s inequality for quantum integrals and also derive its improvements. As applications, the Hermite-Hadamard inequality and its improvements are deduced for -integrals. We also give applications to some well-known inequalities such as the Hermite-Hadamard, Hölder, power mean, and quasiarithmetic mean inequalities. Finally, we give applications to the Shannon entropy, Csiszár divergence, and Zipf-Mandelbrot entropy.

2. Main Results

We begin by presenting our first major finding.

Theorem 2. Let be a convex function, for with . Then, for and , the following inequalities hold:where and with for .

Proof. It is obvious that . Since the function is convex, we have . Hence, by the power mean inequality (6), we have , and similarly, . Using (11) and (12) in (9), we obtain (8).

In the theorem below, we give an integral version of the above theorem.

Theorem 3. Consider the interval and the convex function and let be integrable functions such that for all and . Then, for and , we havewhere with , .

The next main result is presented in the following theorem.

Theorem 4. Let be a convex function, and for with . Then, for and , we havewhere with for and

Proof. By simple calculation, we can write . Since the function is convex, we have . As , from (15) we can write . Now, considering the weights and applying the power mean inequality (6), we have , and similarly, . Using (18) and (19) in (17), we obtain (14).

In the forthcoming theorem, we present the integral version of the above theorem.

Theorem 5. Consider be a convex function and be integrable functions such that for all and . Then, for and , the following inequalities hold:where and with , .

In the following example, we compare our new bound with earlier bounds of the Jensen gap.

Example 1. Let the functions be defined by , and for all . Then, and , .
Now, we calculate the right-hand side of (13) for : . For the same functions, the bound for the Jensen gap from inequality (5) in [3] is . Also, for the bounds of the Jensen gap from inequalities (6) and (11) in [11], we have and , respectively.
Hence, we conclude that for . That is, for these functions, the bound obtained in (13) is better than the earlier bounds obtained in [3, 11].
As is increasing with respect to , for a better estimate we can compute for . For example, is given by . For inequality (20), we consider the above functions with . For these functions, the value of the Jensen difference is , and the value of the bound for the Jensen gap from inequality (5) in [3] is . Now, we calculate the right-hand side of (20): . So, . Similarly, we can obtain an even better bound by considering , that is, . Hence, in this case, we conclude that the bound obtained in (20) is better than the earlier bound obtained in inequality (5) of [3].
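Comparisons of the type carried out in Example 1 are easy to verify numerically. The following minimal Python sketch uses illustrative data and an illustrative convex function (not necessarily those of the example above) to compute the Jensen gap together with weighted power means of the values $f(x_i)$, the quantities that enter power-mean-type bounds:

```python
import numpy as np

def power_mean(values, weights, r):
    """Weighted power mean M_r of positive values; r = 0 gives the geometric mean."""
    w = weights / weights.sum()
    if r == 0:
        return float(np.exp(np.sum(w * np.log(values))))
    return float(np.sum(w * values**r) ** (1.0 / r))

# Illustrative (hypothetical) data, for demonstration only
f = lambda x: x**4                      # a convex function
x = np.array([0.2, 0.5, 0.9, 1.3])      # points in the domain of f
w = np.array([0.1, 0.3, 0.4, 0.2])      # positive weights summing to 1

# Jensen gap: sum w_i f(x_i) - f(sum w_i x_i); nonnegative since f is convex
jensen_gap = np.sum(w * f(x)) - f(np.sum(w * x))
print("Jensen gap:", jensen_gap)

# Weighted power means of the values f(x_i), which appear in
# power-mean-type bounds for the Jensen gap
for r in (1, 2, 4):
    print(f"M_{r} of f(x_i):", power_mean(f(x), w, r))
```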

3. Jensen’s Type Inequalities in Quantum Calculus

The -calculus, or quantum calculus, deals with calculus without using the notion of limits. The renowned mathematician Euler initiated the study of -calculus in the 18th century, when he introduced the parameter in Newton’s work on infinite series. Jackson began a systematic study of -calculus and introduced -definite integrals in the twentieth century [19]. The field of -calculus has various interesting applications in several branches of physics and mathematics, as well as in other areas [20, 21]. This field has received great attention from numerous researchers, and a great deal of research has been devoted to it. In [22], the authors defined a -analogue operator of Ruscheweyh type involving multivalent functions and derived several of its properties. By using the newly presented harmonic -starlike class of functions, some important problems such as distortion limits, necessary and sufficient conditions, convolutions and convexity, and problems with partial sums have been studied in [23]. Srivastava et al. [24] applied the idea of a particular advanced convolution -operator together with the concept of convolution and analyzed two new classes of meromorphically harmonic functions.

First, we give some preliminaries which are useful in our results. Throughout this section, belongs to . Also, we assume that all the series which are used in this section are convergent.

Definition 6 (see [25]). Let be a continuous function and Then, the -derivative of the function at is denoted by and defined by
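A commonly used explicit form of this $q$-derivative on an interval $[a,b]$ (in generic notation, which presumably matches Definition 6) is
\[
{}_{a}D_{q}f(t)=\frac{f(t)-f\bigl(qt+(1-q)a\bigr)}{(1-q)(t-a)},\qquad t\neq a,
\]
with ${}_{a}D_{q}f(a)=\lim_{t\to a}{}_{a}D_{q}f(t)$.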

We say that is -differentiable on if exists for all .

Definition 7 (see [25]). Let be a continuous function. Then, the -integral on is defined asfor . Moreover, if , then the -integral on is defined as
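A commonly used series representation of this $q$-integral (again in generic notation, presumably matching Definition 7) is
\[
\int_{a}^{t} f(s)\,{}_{a}d_{q}s=(1-q)(t-a)\sum_{n=0}^{\infty} q^{n} f\bigl(q^{n}t+(1-q^{n})a\bigr),\qquad t\in[a,b],
\]
and for $c\in(a,t)$ one sets $\int_{c}^{t} f(s)\,{}_{a}d_{q}s=\int_{a}^{t} f(s)\,{}_{a}d_{q}s-\int_{a}^{c} f(s)\,{}_{a}d_{q}s$.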

Remark 8. From Definitions 6 and 7, we make the following remarks:
(1) By taking , the expression in (26) becomes the well-known -derivative, , of the function defined by
(2) Also, if , then (27) reduces to the classical -integral of a function defined by
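For completeness, the classical objects corresponding to the choice $a=0$ are, in standard notation, the $q$-derivative
\[
D_{q}f(t)=\frac{f(t)-f(qt)}{(1-q)t},\qquad t\neq 0,
\]
and the Jackson $q$-integral
\[
\int_{0}^{t} f(s)\,d_{q}s=(1-q)\,t\sum_{n=0}^{\infty} q^{n} f(q^{n}t).
\]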

Now, we present Jensen’s inequality for -integrals.

Theorem 9. Let be a continuous convex function defined on the interval and be a continuous function. Then,

Proof. From the convexity of , we have . Taking for and in (32), we obtain . Multiplying both sides of (33) by and then taking the summation over , we get . Since , from (34) we have , which is equivalent to (31).

As a consequence of the above theorem, we deduce the Hermite-Hadamard inequality for the quantum integral. This inequality has been proved in [26, 27].

Corollary 10. Let be a continuous convex function. Then,

Proof. If , then . Hence, using in (31), we obtain (36).
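For the reader's convenience, the elementary computation behind this step (assuming, as is standard for Hermite-Hadamard-type results, that the substitution is $g(t)=t$ and using the series form of the $q$-integral recalled above) is
\[
\frac{1}{b-a}\int_{a}^{b} t\,{}_{a}d_{q}t=(1-q)\sum_{n=0}^{\infty} q^{n}\bigl(q^{n}b+(1-q^{n})a\bigr)=\frac{b}{1+q}+a-\frac{a}{1+q}=\frac{qa+b}{1+q}.
\]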

Now, we give improvement of quantum integral version of Jensen’s inequality in terms of means.

Theorem 11. Let be a continuous convex function defined on the interval and be a continuous function. Then, for and , the following inequalities hold:where with .

Proof. From the proof of Theorem 9, we see that . By the convexity of , we have . Also, since , by the geometric series we have , that is, , and hence is the arithmetic mean. Now, by using the power mean inequality in (39), we obtain . Similarly, we can prove the left inequality in (42).

As an application of the above theorem, we deduce improvement of quantum integral version of Hermite-Hadamard inequality.

Corollary 12. Let be a continuous convex function. Then, for and , the following inequalities hold:where .

Proof. Using in (42), we obtain (41).

In the following theorem, we present another improvement of quantum integral version of Jensen’s inequality in terms of means.

Theorem 13. Let be a continuous convex function defined on the interval and be a continuous function. Then, for and , the following inequalities hold:where with

Proof. The proof is similar to that of Theorem 11.

As an application of Theorem 13, we deduce improvement of quantum integral version of Hermite-Hadamard inequality.

Corollary 14. Let be a continuous convex function. Then, for and , the following inequalities hold:where .

Proof. Using in (42), we obtain (43).

4. Applications for Some Well-Known Inequalities

In this section, we demonstrate improvements of some well-known inequalities. We start with the Hermite-Hadamard inequality.

Corollary 15. Let be a convex function and , , thenwhere and is the constant function.

Proof. Applying Theorem 5 for , we obtain (44).

In the corollary below, we give improvements of power means inequality.

Corollary 16. Let and be positive -tuples with and such that , and .
(i) If and , then
(ii) If , then the reverse inequalities hold in (45).
(iii) If and , then
(iv) If , then the reverse inequalities hold in (46),
where with and with for .

Proof. (i) If and , then is convex. Therefore, applying Theorem 4 to this function and , we obtain (45).
(ii) If , then is a concave function, so applying Theorem 4 to a concave function, we deduce the reverse inequality in (45).
(iii) If and , then is convex. Therefore, applying Theorem 4 to this function and , we obtain (46).
Similarly, we can prove the reverse inequality in (46).

Corollary 17. Let and be positive -tuples with and such that , . Let the function be monotone continuous function and be convex function, thenwhere with for .

Proof. The inequality (47) can be obtained by using Theorem 2 for and for .

The following two results are devoted to improvements of Hölder inequality.

Proposition 18. Let be positive -tuples and be such that and with , then

Proof. Let ; then clearly is convex for and . Therefore, using the right inequality in (8) for , , and , and then simplifying, we derive that . Since the inequality holds for and , by putting , , and , we obtain . Now, (49) and (50) give (48).

Corollary 19. Let be positive -tuples and let be such that and , thenwhere and with

Proof. Since the function is convex for , using (8) for , , and , we get (51).

Remark 20. Similarly, we can present applications of the second main result. Also, we can give integral versions of the above results.

5. Applications in Information Theory

Information theory studies how to measure, store, and transmit digital information. The field of information theory was initially established by Hartley’s work in 1920 and gained worldwide attention through the work of Shannon in the 1940s. Information theory is closely connected with probability theory, statistical mechanics, and information and electrical engineering. A fundamental measure in information theory is entropy, which quantifies the uncertainty involved in the outcome of a random process. After Shannon’s work, the field attracted the attention of many scientists and expanded considerably. Various entropy functionals have been introduced, which can be regarded as generalized entropies.

Definition 21. The Csiszár-divergence for two positive -tuples , is defined bywhere is a convex function.
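In one common notation (generic symbols, which may differ from those used above), for positive $n$-tuples $\mathbf{p}=(p_1,\dots,p_n)$, $\mathbf{q}=(q_1,\dots,q_n)$ and a convex function $f$, the Csiszár $f$-divergence is
\[
C_f(\mathbf{p},\mathbf{q})=\sum_{i=1}^{n} q_i\, f\!\left(\frac{p_i}{q_i}\right).
\]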

Theorem 22. Let , be positive -tuples and be a convex function. If and , thenwhere with for .

Proof. Using (8) for , , and for , we obtain (53).

Definition 23 (Shannon entropy). Let be a positive probability distribution; then the Shannon entropy is defined by
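In standard notation, for a positive probability distribution $\mathbf{p}=(p_1,\dots,p_n)$ with $\sum_{i=1}^{n}p_i=1$, the Shannon entropy is
\[
S(\mathbf{p})=-\sum_{i=1}^{n} p_i \log p_i.
\]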

Corollary 24. Let be a positive probability distribution. If and , thenwhere with for .

Proof. Taking , , for each , in (53), we obtain (55).

Corollary 25. Let , be positive -tuples with . If and , thenwhere with for .
In particular, if for each , thenwhere and with for .

Proof. Taking , for each , in (53), we obtain (56).

Definition 26. The Kullback-Leibler divergence for two positive probability distributions and is defined by
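In the same generic notation, the Kullback-Leibler divergence of positive probability distributions $\mathbf{p}$ and $\mathbf{q}$ is
\[
D(\mathbf{p}\,\|\,\mathbf{q})=\sum_{i=1}^{n} p_i \log\frac{p_i}{q_i}.
\]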

Corollary 27. Let , be positive probability distributions, then for and , we havewhere with for .

Proof. Taking , in (53), we obtain (59).

Corollary 28. Assume that all the assumptions of Corollary 27 hold. Then,where with for .

Proof. Taking , in (53), we obtain (60).

Definition 29. If and are two positive probability distributions, then the variational distance is defined by
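In standard notation, the variational (total variation) distance is
\[
V(\mathbf{p},\mathbf{q})=\sum_{i=1}^{n} \lvert p_i-q_i\rvert.
\]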

Corollary 30. Assume that all the assumptions of Corollary 27 hold. Then,where with for .

Proof. Using the function , in (53), we obtain (62).

Definition 31. If and are two positive probability distributions, then the Jeffrey distance is defined by
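In standard notation, the Jeffrey distance is
\[
J(\mathbf{p},\mathbf{q})=\sum_{i=1}^{n} (p_i-q_i)\log\frac{p_i}{q_i}.
\]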

Corollary 32. Assume that all the assumptions of Corollary 27 hold. Then,where with for .

Proof. Applying the function , in (53), we obtain (64).

Definition 33. If and are positive probability distributions, then the Bhattacharyya coefficient is defined by
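In standard notation, the Bhattacharyya coefficient is
\[
B(\mathbf{p},\mathbf{q})=\sum_{i=1}^{n}\sqrt{p_i q_i}.
\]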

Corollary 34. Assume that all the assumptions of Corollary 27 hold, thenwhere with for .

Proof. Using the function , in (53), we obtain (66).

Definition 35. If and are positive probability distributions, then the Hellinger distance is defined by
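In one common normalization, the (squared) Hellinger distance is
\[
h^{2}(\mathbf{p},\mathbf{q})=\frac{1}{2}\sum_{i=1}^{n}\bigl(\sqrt{p_i}-\sqrt{q_i}\bigr)^{2}.
\]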

Corollary 36. Under the assumptions of Corollary 27, the following inequality holds:where with for .

Proof. Using the function , in (53), we obtain (68).

Definition 37. For two positive probability distributions , , the triangular discrimination is defined by
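In standard notation, the triangular discrimination is
\[
\Delta(\mathbf{p},\mathbf{q})=\sum_{i=1}^{n}\frac{(p_i-q_i)^{2}}{p_i+q_i}.
\]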

Corollary 38. Assume that all the assumptions of Corollary 27 hold, thenwhere with for .

Proof. Since the function is convex, applying this function in (53), we obtain (70).

Now, we give inequalities for the Zipf-Mandelbrot entropy. Before presenting the results, we include a brief introduction to it.

Zipf’s law is a fundamental and helpful law in the field of information science. In the field of social studies, this law has been regarded as a valuable statistical distribution procedure. The law describes the relation between the size and the rank of discrete phenomena. Furthermore, this law has also been applied to the intensity of solar flares, earthquake magnitudes, geology, city populations, the size of moon craters, website traffic, and so on. It has had limited success in geology when it comes to assessing petroleum and other mineral resources.

Mandelbrot introduced the Zipf-Mandelbrot law in 1966 as a further refinement of Zipf’s law [28]. This law takes into account the low-rank words in a corpus [29]: ; if one puts , Zipf’s law is recovered. There are numerous applications of the Zipf-Mandelbrot law in linguistics [29, 30] and information sciences [31], and it is also useful in ecological field studies [32]. The Zipf-Mandelbrot entropy is given by , where , , and the Zipf-Mandelbrot law is given by:
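In one common parametrization (with generic symbols $N\in\mathbb{N}$, $t\ge 0$, $s>0$ that may differ from the notation above), the Zipf-Mandelbrot law assigns to rank $i\in\{1,\dots,N\}$ the probability
\[
f(i;N,t,s)=\frac{1/(i+t)^{s}}{H_{N,t,s}},\qquad H_{N,t,s}=\sum_{j=1}^{N}\frac{1}{(j+t)^{s}},
\]
and the associated Zipf-Mandelbrot entropy can then be written as
\[
Z(H,t,s)=\frac{s}{H_{N,t,s}}\sum_{j=1}^{N}\frac{\ln(j+t)}{(j+t)^{s}}+\ln H_{N,t,s}.
\]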

Now, we are in a position to give inequalities for the Zipf-Mandelbrot entropy.

Corollary 39. Let , , with . If and , thenwhere with for .

Proof. Let , , then

Since , therefore, . Hence, using (59) for , we obtain (72).

Utilizing two Zipf-Mandelbrot laws associated with distinct parameters, we derive an estimate for the Zipf-Mandelbrot entropy.

Corollary 40. Let , . If and , thenwhere with , .

Proof. Let and , , then as in the proof of Corollary 39, we haveAlso, and
Therefore using (59) for and , we obtain (74).

Remark 41. Similarly, we can use (60) and derive inequalities for Zipf-Mandelbrot entropy.

6. Conclusion

In the literature, there are several results devoted to Jensen’s inequality and its related inequalities. In this manuscript, we have used a new approach to derive improvements of Jensen’s inequality. We have given Jensen’s inequality and its improvements in quantum calculus. In particular, we have deduced Hermite-Hadamard-type inequalities for -integrals. We have also given applications of the main results to some well-known inequalities and in information theory. The results of this manuscript initiated in quantum calculus may stimulate further research.

Data Availability

No data were used to support this study.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors’ Contributions

All authors contributed equally and significantly in this paper. All authors read and approved the final manuscript.