Abstract

To avoid exorbitant and extensive laboratory experiments, QSPR analysis based on topological descriptors is a very constructive statistical approach for analyzing the numerous physical and chemical properties of compounds. Therefore, we present some new entropy measures that are based on the sums of the degrees of the neighboring vertices. Firstly, we partition the edges of the benzene derivatives according to the degree sums of the neighboring vertices and then compute the neighborhood versions of the entropies. Secondly, we use the software SPSS to develop correlations between the newly introduced entropies and the physicochemical properties of the benzene derivatives. Our results demonstrate that the critical temperature, critical pressure, and critical volume can be predicted through the fifth geometric arithmetic entropy, the second entropy, and the fifth entropy, respectively. The remaining physical characteristics, including the Gibbs energy, molar refractivity, and the Henry's law constant, can be predicted by using the sixth entropy.

1. Introduction

In chemistry, models are classified into two types. The first type of modeling is based on quantum-chemical methods and on theoretically identical models derived from statistical mechanics. The second type relies on chemical reasoning, which compares related systems. According to the core principle of chemistry, all the features of matter correlate with its molecular structure, and thus molecules with similar structures have similar properties. This has resulted in several empirical approaches such as SAR, SPC, QSPR, and QSAR [1, 2].

The common factor of all these methods is the correlation between physicochemical properties and molecular descriptors. There are numerous descriptors to choose from [3], but one family of descriptors has proved exceptionally simple and valuable in predicting multiple molecular properties. These are known as topological indices. Topological indices are numerical parameters associated with a graph that help characterize its topology. There are three main types of topological indices: degree-based, spectrum-based, and distance-based indices [4, 5]. Degree-based indices, which are defined in terms of the degrees of the vertices of a graph, are among the most studied descriptors used in mathematical chemistry [6]. Imran et al. [7] discussed the topological properties of symmetric chemical structures. Zou et al. [8] computed topological indices for polyphenylene. Recently, some neighborhood versions of topological indices have also been introduced [9, 10]. Zhang et al. [11–13] discussed the topological indices of generalized bridge molecular graphs, carbon nanotubes, and the products of chemical graphs.

The idea of entropy first appeared in thermodynamics in the nineteenth century, when it was closely linked to heat flow and was a key component of the second law of thermodynamics. Subsequently, statistical mechanics used the notion to give thermodynamics a physical interpretation. This was due to leading physicists such as Boltzmann and Gibbs [14, 15], who employed entropy as a measure of the disorder of the massive dynamical systems underlying collections of molecules. Fisher employed similar notions in establishing the foundations of theoretical statistics in the 1920s [16]. In the 1950s, Kullback and Leibler [17] developed this concept further. Zhang et al. [18–20] provided a physical analysis of the heat of formation and entropy of ceria oxide.

Claude Shannon, a mathematician and electrical engineer who worked at Bell Labs in New Jersey in the mid-twentieth century, identified the link between entropy and information content [21]. The concept formed a key element of the emerging field of information theory at that time. Afterward, in 1957, Jaynes explained the explicit association between Shannon's entropy and that of statistical mechanics in a series of excellent works [22, 23]. Since then, information theory has found applications in a variety of fields, including network analysis, mathematical statistics, complexity theory, and financial mathematics.

We performed a bibliometric analysis based on the Scopus database (https://www.scopus.com). This analysis is based upon 986 research articles with entropy and graph entropy [24–26] as keywords. The percentage of publications in different subject areas is shown in Figure 1.

Figure 2 presents a bibliometric analysis of the research on the concept of entropy conducted in different countries.

In recent times, a somewhat different approach, namely, the use of Shannon's entropy in terms of topological indices, was introduced by Manzoor et al. [27]. Continuing this work, they also introduced eccentricity-based graph entropies [28, 29] and bond-additive graph entropies [30]. In this paper, the present authors formulate some new graph entropies, namely, neighborhood versions of graph entropies. For a graph $G$ with vertex set $V(G)$ and edge set $E(G)$, let $\psi(uv)$ denote the weight of the edge $uv \in E(G)$. The graph entropy is given by the following formula:
\[ \mathrm{ENT}_{\psi}(G) = -\sum_{uv \in E(G)} \frac{\psi(uv)}{\sum_{xy \in E(G)} \psi(xy)} \log\!\left[\frac{\psi(uv)}{\sum_{xy \in E(G)} \psi(xy)}\right], \]
where $V(G)$ is the vertex set, $E(G)$ is the edge set, and $\psi(uv)$ is the edge weight of the edge $uv$. Throughout, $\delta(u)$ denotes the neighborhood degree sum of the vertex $u$, that is, the sum of the degrees of the vertices adjacent to $u$. The entropies studied in this paper are obtained from the following choices of $\psi$:
(i) Neighborhood version of forgotten entropy. If $\psi(uv) = \delta(u)^{2} + \delta(v)^{2}$, then equation (1) is called the neighborhood version of forgotten entropy.
(ii) Neighborhood version of second Zagreb entropy. If $\psi(uv) = \delta(u)\delta(v)$, then equation (2) is called the neighborhood version of second Zagreb entropy.
(iii) Neighborhood version of hyper-Zagreb entropy. If $\psi(uv) = [\delta(u) + \delta(v)]^{2}$, then equation (3) is called the neighborhood version of hyper-Zagreb entropy.
(iv) First entropy. If $\psi(uv)$ is the edge contribution of the first neighborhood degree sum-based index $ND_{1}$, then equation (4) is called the first entropy.
(v) Second entropy. If $\psi(uv)$ is the edge contribution of $ND_{2}$, then equation (5) is called the second entropy.
(vi) Third entropy. If $\psi(uv)$ is the edge contribution of $ND_{3}$, then equation (6) is called the third entropy.
(vii) Fourth entropy. If $\psi(uv)$ is the edge contribution of $ND_{4}$, then equation (7) is called the fourth entropy.
(viii) Fifth entropy. If $\psi(uv)$ is the edge contribution of $ND_{5}$, then equation (8) is called the fifth entropy.
(ix) Sixth entropy. If $\psi(uv)$ is the edge contribution of $ND_{6}$, then equation (9) is called the sixth entropy.
(x) Neighborhood version of entropy. If $\psi(uv)$ is the edge contribution of the corresponding neighborhood index, then equation (10) is called the neighborhood version of this entropy.
(xi) Neighborhood version of first entropy. If $\psi(uv)$ is the edge contribution of the corresponding first neighborhood index, then equation (11) is called the neighborhood version of first entropy.
(xii) Neighborhood version of second entropy. If $\psi(uv)$ is the edge contribution of the corresponding second neighborhood index, then equation (12) is called the neighborhood version of second entropy.
(xiii) Neighborhood version of modified Randić entropy. If $\psi(uv)$ is the edge contribution of the neighborhood modified Randić index, then equation (13) is called the neighborhood version of modified Randić entropy.
(xiv) Neighborhood version of redefined first Zagreb entropy. If $\psi(uv) = \frac{\delta(u) + \delta(v)}{\delta(u)\delta(v)}$, then equation (14) is called the neighborhood version of redefined first Zagreb entropy.
(xv) Neighborhood version of redefined second Zagreb entropy. If $\psi(uv) = \frac{\delta(u)\delta(v)}{\delta(u) + \delta(v)}$, then equation (15) is called the neighborhood version of redefined second Zagreb entropy.
(xvi) Neighborhood version of atom bond connectivity entropy. If $\psi(uv) = \sqrt{\frac{\delta(u) + \delta(v) - 2}{\delta(u)\delta(v)}}$, then equation (16) is called the neighborhood version of atom bond connectivity entropy [31].
(xvii) Neighborhood version of geometric arithmetic entropy. If $\psi(uv) = \frac{2\sqrt{\delta(u)\delta(v)}}{\delta(u) + \delta(v)}$, then equation (17) is called the neighborhood version of geometric arithmetic entropy [31].
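To make the above definitions concrete, the following minimal Python sketch computes the neighborhood degree sums $\delta(u)$, the resulting edge partition, and one of the entropies for a small hydrogen-suppressed molecular graph. The toluene skeleton, its adjacency list, and the choice of the neighborhood second Zagreb weight $\psi(uv) = \delta(u)\delta(v)$ are illustrative assumptions and are not taken from the paper's data sets.

```python
import math

# Hydrogen-suppressed toluene skeleton (illustrative only):
# ring carbons 0-5, methyl carbon 6 attached to vertex 0.
adj = {
    0: [1, 5, 6], 1: [0, 2], 2: [1, 3],
    3: [2, 4], 4: [3, 5], 5: [4, 0], 6: [0],
}

deg = {v: len(nbrs) for v, nbrs in adj.items()}
# Neighborhood degree sum delta(u): sum of the degrees of the vertices adjacent to u.
delta = {v: sum(deg[w] for w in adj[v]) for v in adj}

# Collect each edge once as a sorted pair.
edges = {tuple(sorted((u, v))) for u in adj for v in adj[u]}

# Edge partition based on the pair (delta(u), delta(v)).
partition = {}
for u, v in edges:
    key = tuple(sorted((delta[u], delta[v])))
    partition[key] = partition.get(key, 0) + 1
print("edge partition:", partition)

# Shannon-type graph entropy for an edge-weight function psi (natural logarithm).
def graph_entropy(edges, psi):
    weights = [psi(u, v) for u, v in edges]
    total = sum(weights)
    return -sum((w / total) * math.log(w / total) for w in weights)

# Example: neighborhood second Zagreb weight psi(uv) = delta(u) * delta(v).
ent_m2 = graph_entropy(edges, lambda u, v: delta[u] * delta[v])
print("neighborhood second Zagreb entropy:", round(ent_m2, 4))
```

The same `graph_entropy` helper can be reused for any of the seventeen entropies by swapping the weight function `psi`.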

2. Curvilinear Regression Analysis of Proposed Entropies

In this section, we analyze the entropies defined above against the following physical characteristics of the benzene derivatives [32, 33]: critical pressure, critical temperature, critical volume, Gibbs energy, molar refractivity, the Henry's law constant, and one further property listed in Table 1. The experimental values of the physical characteristics of the benzene derivatives are taken from [34] and presented in Table 1. The values of the proposed entropies for the benzene derivatives are presented in Tables 2 and 3. Figures 3 and 4 illustrate the structures of the benzene derivatives.

We analyze the proposed entropies vis-à-vis the physical characteristics using regression models; the linear model has the form
\[ Y = a \cdot \mathrm{ENT} + b, \tag{36} \]
where $Y$ is the physical property, $\mathrm{ENT}$ is the entropy, and $a$ and $b$ represent the regression coefficient and the constant, respectively. For the seven physicochemical properties, we computed the correlation between each property and the seventeen entropies proposed above. We now present the analysis of the linear model based on the correlation coefficient. Following the recommendations of the International Academy of Mathematical Chemistry (IAMC), we have only considered regression models whose correlation coefficient exceeds the recommended cutoff.
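As an illustration of how the linear model of equation (36) can be fitted in practice, the short Python sketch below uses scipy.stats.linregress to estimate the coefficient a, the constant b, and the correlation coefficient R for one entropy-property pair. The numerical values are placeholders for illustration only; they are not the experimental data of Table 1.

```python
from scipy import stats

# Hypothetical illustration only: entropy values (ENT) and a physical property (Y)
# for a handful of compounds; these numbers are placeholders, not the paper's data.
ent = [3.12, 3.45, 3.78, 4.01, 4.33, 4.60]
prop = [318.0, 340.5, 361.2, 377.9, 398.4, 414.1]

# Fit the linear model Y = a * ENT + b of equation (36).
fit = stats.linregress(ent, prop)
print(f"a (coefficient) = {fit.slope:.3f}")
print(f"b (constant)    = {fit.intercept:.3f}")
print(f"R               = {fit.rvalue:.4f}")
```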

2.1. Linear Regression Models

Using equation (36), we obtained the linear regression models (LRMs) for the seven physicochemical properties via each of the proposed entropies, and the results are presented in Tables 4–10.
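A sketch of how such a battery of linear models could be assembled is given below, assuming the entropy values and the experimental properties are stored in a pandas DataFrame with one row per benzene derivative; the column names and the cutoff r_min are hypothetical choices, not the paper's.

```python
import pandas as pd
from scipy import stats

def correlation_table(df, entropy_cols, property_cols, r_min=0.8):
    """Fit Y = a*ENT + b for every (entropy, property) pair and tabulate R.

    r_min is an assumed screening cutoff used here only for illustration;
    the paper itself follows the IAMC recommendation when screening models.
    """
    rows = []
    for prop in property_cols:
        for ent in entropy_cols:
            fit = stats.linregress(df[ent], df[prop])
            if abs(fit.rvalue) >= r_min:
                rows.append({"property": prop, "entropy": ent,
                             "a": fit.slope, "b": fit.intercept, "R": fit.rvalue})
    return pd.DataFrame(rows)

# Hypothetical usage: df holds one row per benzene derivative, with entropy and
# property columns named as below (placeholder names).
# table = correlation_table(df, ["ENT_F", "ENT_M2", "ENT_GA5"], ["Tc", "Pc", "Vc"])
```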

Table 4 shows the relation between the entropies and the critical temperature. From Table 4, we can easily see that all the entropies show a high positive correlation with the critical temperature. The most significant regression models are as follows:

Figure 5 shows that all the points fall near the fitted line. Among all the entropies, the fifth geometric arithmetic entropy shows the best relation with the critical temperature. Hence, the critical temperature can be predicted by using the fifth geometric arithmetic entropy.

Table 5 shows the relation between the entropies and the critical pressure, and all the entropies show a high positive correlation with the critical pressure. The most significant regression models are as follows:

Among all the entropies, the second entropy shows the best relation with the critical pressure, and all the points fall near the fitted line shown in Figure 6. Hence, the critical pressure can be predicted by using the second entropy.

Table 6 shows the relation between the entropies and the critical volume. From Table 6, we can easily see that all the entropies show a high positive correlation with the critical volume. The most significant regression models are as follows:

Among all the entropies, the fifth entropy shows the best relation with the critical volume, and all the points fall near the fitted line, as can be seen in Figure 7. Hence, the critical volume can be predicted by using the fifth entropy.

Table 7 shows the relation between the entropies and the Gibbs energy of the benzene derivatives. From Table 7, we can easily see that all the entropies show a high positive correlation with the Gibbs energy.

Of all the entropies, the sixth entropy shows the best relation with the Gibbs energy, and all the points fall near the fitted line, as seen in Figure 8. Hence, the Gibbs energy can be predicted by using the sixth entropy.

Table 8 shows the relation between the entropies and the remaining physicochemical property from Table 1. From Table 8, we can easily see that all the entropies show a high positive correlation with this property. The most significant regression models are as follows:

Among all the entropies, the sixth entropy shows the best relation with this property, and all the points fall near the fitted line, as can be seen in Figure 9. Hence, this property can be predicted by using the sixth entropy.

Table 9 shows the relation between the entropies and the molar refractivity of the benzene derivatives. From Table 9, we can easily see that all the entropies show a high positive correlation with the molar refractivity.

Of all the entropies, the sixth entropy shows the best relation with the molar refractivity, and all the points fall near the fitted line, as seen in Figure 10. Hence, the molar refractivity can be predicted by using the sixth entropy.

Table 10 shows the relation between the entropies and the Henry's law constant. From Table 10, we can easily see that all the entropies show a high positive correlation with the Henry's law constant. The most significant regression models are as follows:

Among all the entropies, the sixth entropy shows the best relation with the Henry's law constant, and all the points fall near the fitted line, as can be seen in Figure 11. Hence, the Henry's law constant can be predicted by using the sixth entropy.

3. Concluding Remarks

Quantitative structure-property relationships (QSPRs) mathematically link physical or chemical properties with the structure of a molecule. Entropies are defined on molecular structures for a better understanding of their physical features [35] and chemical reactivity. In this paper, we developed QSPRs between the proposed entropies and the physical characteristics of benzene derivatives. Based on the linear regression models, we found that the critical temperature, critical pressure, and critical volume can be predicted through the fifth geometric arithmetic entropy, the second entropy, and the fifth entropy, respectively. The remaining physical characteristics, such as the Gibbs energy, molar refractivity, and the Henry's law constant, can be predicted by using the sixth entropy.

Data Availability

The data used to support the findings of this study are cited at relevant places within the text as references.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authors’ Contributions

All authors contributed equally to this work.