Abstract

Chemical graph entropy plays a significant role in measuring the complexity of chemical structures. It has explicit uses in chemistry, biology, and the information sciences. A molecular structure of a compound consists of many atoms. In particular, hydrocarbons are chemical compounds consisting of carbon and hydrogen atoms. In this article, we discuss the concept of the subdivision of chemical graphs and their corresponding line chemical graphs. More precisely, we discuss the properties of chemical graph entropies and then construct three chemical structures, namely the triangular benzenoid, the hexagonal parallelogram, and the zigzag-edge coronoid fused with starphene. We also estimate the degree-based entropies with the help of the line graphs of the subdivisions of the above-mentioned chemical graphs.

1. Introduction

Mathematical chemistry is a field of theoretical chemistry that uses mathematical approaches to discuss molecular structure without necessarily referring to quantum mechanics [1]. Chemical Graph Theory is a branch of mathematical chemistry in which a chemical phenomenon is described theoretically using graph theory [2, 3]. The growth of the organic disciplines has been aided by Chemical Graph Theory [4, 5]. In mathematical chemistry, graph invariants or topological indices are numeric quantities that describe various essential features of organic components and are produced from an analogous molecular graph [6, 7]. Degree-based indices are among the topological indices used to predict the bioactivity, boiling point, draining energy, stability, and physico-chemical properties of certain chemical compounds [8, 9]. Due to their chemical applications, these indices play a significant role in theoretical chemistry. Zhang et al. [10–12] discussed the topological indices of generalized bridge molecular graphs, carbon nanotubes, and products of chemical graphs. Zhang et al. [13–15] provided the physical analysis of the heat of formation and entropy of ceria oxide. For further study of indices, see [16, 17]. Shannon [18] originated the conception of information entropy in communication theory. However, it was later discovered as a quantity that applies to all objects with a set nature [19, 20], including molecular graphs [21–23]. In chemistry, information entropy is now used in two modes. Firstly, it is a structural descriptor for assessing the complexity of chemical structures [24]. Information entropy is useful in this regard for connecting structural and physico-chemical features [25], numerically distinguishing isomers of organic molecules [26], and classifying natural products and synthetic chemicals [27, 28]. The physico-chemical interpretation of information entropy is the second mode of application.
As a result, Terenteva and Kobozev demonstrated its utility in analyzing physico-chemical processes that simulate information transmission [29]. Zhdanov [30] used entropy values to study chemical processes of organic compounds. The information entropy is defined as

ENT_ψ(G) = − Σ_{uv∈E(G)} [ψ(uv) / Σ_{u′v′∈E(G)} ψ(u′v′)] log [ψ(uv) / Σ_{u′v′∈E(G)} ψ(u′v′)].  (1)

Here, the logarithm is taken with a fixed base, while V(G), E(G), and ψ(uv) represent the vertex set, the edge set, and the weight of the edge uv in G. Many graph entropies have been calculated in the literature utilizing characteristic polynomials, vertex degrees, and graph order [31–34]. Graph entropies based on independent sets, matchings, and the degrees of vertices [35] have been estimated in recent years. Dehmer and Mowshowitz proposed several graph complexity and Hosoya entropy relationships [23, 32, 36, 37]. For further study, see [19, 21, 38–42, 59, 60]. A graph G is an ordered pair consisting of a vertex set V(G) and an edge set E(G), with the edges joining the vertices. When two vertices of G share an edge, they are said to be adjacent. The degree of a vertex u is denoted by d(u), and the sum of the degrees of all vertices adjacent to u is denoted by S(u). By replacing each edge of G with a path of length two, the subdivision graph S(G) is formed. The line graph of G is denoted by L(G), in which V(L(G)) = E(G) and two vertices of L(G) are adjacent iff their corresponding edges share a common end point in G.
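As a hedged illustration, the information entropy defined above (equation (1)) can be evaluated numerically from a list of edge weights. The following is a minimal Python sketch; the function name and the choice of logarithm base are ours, since the article's base was not recoverable:

```python
import math

def graph_entropy(weights, base=10):
    """Shannon-type graph entropy of equation (1) for edge weights psi(uv).

    `weights` is a list of positive edge weights; `base` is the logarithm
    base, which is an assumption here (the article leaves it unspecified).
    """
    total = sum(weights)
    # each edge contributes -(psi/total) * log(psi/total)
    return -sum((w / total) * math.log(w / total, base) for w in weights)
```

For a graph whose m edges all carry the same weight, the entropy reduces to log(m), its maximum value; a single-edge graph has entropy zero.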

1.1. Randić Entropy [43, 44]

If ψ(uv) = (d(u)·d(v))^α, where α is a real number, then (1) represents the Randić entropy (3).

1.2. Atom Bond Connectivity Entropy [45]

If ψ(uv) = √((d(u) + d(v) − 2)/(d(u)·d(v))), then (1) is converted into the atom bond connectivity entropy (5).

1.3. The Geometric Arithmetic Entropy [43, 44]

If ψ(uv) = 2√(d(u)·d(v))/(d(u) + d(v)), then (1) takes the form of the geometric arithmetic entropy (7).

1.4. The Fourth Atom Bond Connectivity Entropy [35]

If ψ(uv) = √((S(u) + S(v) − 2)/(S(u)·S(v))), then (1) is converted into the fourth atom bond connectivity entropy (9).

1.5. The Fifth Geometric Arithmetic Entropy [35]

If ψ(uv) = 2√(S(u)·S(v))/(S(u) + S(v)), then (1) is changed to the form (11), which is known as the fifth geometric arithmetic entropy.

See [35, 44] for further information on these entropy measures.
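The five edge-weight functions above can be collected in one place. The following Python sketch uses our own function names (not the article's); d(u), d(v) are end-vertex degrees and S(u), S(v) are neighbour degree sums:

```python
import math

# Edge-weight functions psi(uv) substituted into equation (1).
def psi_randic(du, dv, alpha=-0.5):
    # Randic weight (d(u)*d(v))^alpha
    return (du * dv) ** alpha

def psi_abc(du, dv):
    # atom bond connectivity weight
    return math.sqrt((du + dv - 2) / (du * dv))

def psi_ga(du, dv):
    # geometric arithmetic weight
    return 2 * math.sqrt(du * dv) / (du + dv)

def psi_abc4(Su, Sv):
    # fourth ABC weight, using neighbour degree sums S(u), S(v)
    return math.sqrt((Su + Sv - 2) / (Su * Sv))

def psi_ga5(Su, Sv):
    # fifth GA weight, using neighbour degree sums S(u), S(v)
    return 2 * math.sqrt(Su * Sv) / (Su + Sv)
```

Note that the GA-type weights equal 1 exactly when the two arguments coincide, which is why edges with equal end degrees contribute maximally.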

2. Formation of Triangular Benzenoid

Triangular benzenoids are a family of benzenoid molecular graphs, denoted by T_p, where p is the number of hexagons in the bottom row and p(p + 1)/2 is the total number of hexagons in T_p. Triangular benzenoids are a generalization of the benzene molecule C6H6, with benzene rings forming a triangular shape. The benzene molecule is a common molecule in physics, chemistry, and the nanosciences, and it is quite fruitful for synthesizing aromatic chemicals [46]. Raut [47] calculated some topological indices for the triangular benzenoid system. Hussain et al. [48] discussed the irregularity determinants of some benzenoid systems. Kwun [49] calculated degree-based indices by using M-polynomials. For further details, see [50, 51]. The hexagons are placed in rows, with each row increasing by one hexagon. For p = 1, there is only one type of edge, whose end vertices both have degree 2. For p = 2, there are three kinds of edges, with end-degree pairs (2, 2), (2, 3), and (3, 3), and the same three edge types occur for every larger p. The subdivision graph of T_p and its line graph are demonstrated in Figure 1. It is to be noted that L(S(T_p)) has 2|E(T_p)| vertices, one for each edge of the subdivision graph.

Let G = L(S(T_p)), i.e., G is the line graph of the subdivision graph of the triangular benzenoid T_p. We will use the edge partition and vertex counting technique to compute the abstracted indices and entropies. The edge partition of G is based on the degrees of each edge's terminal vertices. It is easy to see that there are only three types of edges, shown in Table 1.
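The construction of the line graph of a subdivision and its degree-based edge partition can be sketched in Python for any small graph given as an edge list (function names are ours, for illustration only). Applied to a single hexagon, i.e., the benzene ring, it reproduces a single edge type (2, 2):

```python
from itertools import combinations
from collections import Counter

def subdivision(edges):
    # replace each edge uv by a path u-w-v through a fresh vertex w
    out = []
    for i, (u, v) in enumerate(edges):
        w = ('s', i)
        out += [(u, w), (w, v)]
    return out

def line_graph(edges):
    # vertices of L(G) are the edges of G; two are adjacent
    # iff the corresponding edges share an end point
    return [(e, f) for e, f in combinations(edges, 2) if set(e) & set(f)]

def degree_partition(edges):
    # count edges by the sorted degree pair of their end vertices
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return Counter(tuple(sorted((deg[u], deg[v]))) for u, v in edges)

hexagon = [(i, (i + 1) % 6) for i in range(6)]
partition = degree_partition(line_graph(subdivision(hexagon)))
# -> Counter({(2, 2): 12}): L(S(C6)) is a 12-cycle with one edge type
```

The same three functions, applied to larger benzenoid edge lists, yield the edge partitions summarized in Table 1.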

2.1. Entropy Measure for L(S(T_p))

We will calculate the entropies of L(S(T_p)) in this section.

2.1.1. Randić Entropy of L(S(T_p))

The Randić index and entropy for L(S(T_p)), with the help of Table 1 and equation (3), are:

By putting α = 1, −1, 1/2, −1/2 in (3), we get the Randić entropies as given below:
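Given an edge partition like the one in Table 1, the Randić entropy computation is mechanical. The sketch below uses hypothetical counts purely for illustration; the real counts depend on p and are those given in Table 1:

```python
import math

# Hypothetical edge partition {(d(u), d(v)): count} standing in for Table 1;
# the actual counts for L(S(T_p)) depend on p.
partition = {(2, 2): 6, (2, 3): 8, (3, 3): 4}

def randic_entropy(partition, alpha=-0.5, base=10):
    # weight each edge class by (d(u)*d(v))^alpha, then apply equation (1)
    weighted = [((du * dv) ** alpha, n) for (du, dv), n in partition.items()]
    total = sum(w * n for w, n in weighted)
    return -sum(n * (w / total) * math.log(w / total, base)
                for w, n in weighted)
```

As a sanity check, a partition with a single edge class of 10 edges gives entropy log10(10) = 1, the uniform maximum.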

2.1.2. The Atom Bond Connectivity Entropy of L(S(T_p))

The ABC index and entropy measure, with the help of Table 1 and equation (5), are:

2.1.3. The Geometric Arithmetic Entropy of L(S(T_p))

The GA index and entropy measure, with the help of Table 1 and equation (7), are:

2.1.4. The Fourth Atom Bond Connectivity Entropy of L(S(T_p))

The edge partition of the graph L(S(T_p)), based on the degree sums of the terminal vertices of every edge, is shown in Table 2.

After simple calculations by using Table 2, we get

By using (9), the entropy is as follows:

For a particular value of p, the numerical values of the index and the entropy follow directly.
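The degree sums S(u) that underlie the partitions of Tables 2, 4, 5, and 7 can be computed directly from an edge list. A minimal Python sketch (function names are ours) is:

```python
from collections import Counter, defaultdict

def degree_sums(edges):
    # S[u] = sum of the degrees of the neighbours of u
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    deg = {u: len(nb) for u, nb in adj.items()}
    return {u: sum(deg[w] for w in nb) for u, nb in adj.items()}

def degree_sum_partition(edges):
    # count edges by the sorted pair (S(u), S(v)) of their end vertices
    S = degree_sums(edges)
    return Counter(tuple(sorted((S[u], S[v]))) for u, v in edges)
```

On a single hexagon, every vertex has two neighbours of degree 2, so S(u) = 4 for all u and all six edges fall in one class (4, 4).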

2.1.5. The Fifth Geometric Arithmetic Entropy of L(S(T_p))

After some simple calculations, the GA5 index may be calculated using Table 2.

Therefore, (11), together with Table 2, is converted into the form:

3. Formation of Hexagonal Parallelogram Nanotubes P(x, y)

Hexagonal parallelogram nanotubes are formed by arranging hexagons in a parallelogram fashion. Baig et al. [52] computed counting polynomials of benzenoid carbon nanotubes; also see [53]. We will denote this structure by P(x, y), in which x and y represent the number of hexagons in each row and column, respectively. The subdivision graph of P(x, y) and its line graph are shown in Figure 2; see [46]. Let G = L(S(P(x, y))). To compute our results, we will use the edge partition technique, which is based on the degrees of the terminal vertices of every edge. It is to be noted that there are only three types of edges; see Figure 2. The edge partition of the chemical graph depending on the degrees of the terminal vertices is presented in Table 3.

3.1. Entropy Measure for L(S(P(x, y)))

We will enumerate the entropies of L(S(P(x, y))) in this section.

3.1.1. Randić Entropy of L(S(P(x, y)))

The Randić index for L(S(P(x, y))), by using Table 3, is:

So (3), with Table 3, gives the Randić entropy, which is converted into the form:

Now substitute α = 1, −1, 1/2, −1/2 in (20); we get the Randić entropies as given below:

3.1.2. The Atom Bond Connectivity Entropy of L(S(P(x, y)))

With the use of Table 3 and equation (5), we can calculate the ABC index and entropy measure as follows:

Therefore, equation (5), with Table 3, becomes as follows and is called the atom bond connectivity entropy:

3.1.3. The Geometric Arithmetic Entropy of L(S(P(x, y)))

We can calculate the GA index and entropy measure using Table 3 and equation (7) as follows:

3.1.4. The Fourth Atom Bond Connectivity Entropy of L(S(P(x, y)))

Case 1. When x > 1.
The edge partition of L(S(P(x, y))) is shown in Table 4. Since L(S(P(x, y))) has seven kinds of edges in this case, the ABC4 index and entropy measure follow from Table 4 and equation (9); thus (9), by using Table 4, is converted into the form:

Case 2. When x = 1.
By using the same process, we get the closed expressions for the ABC4 index and entropy as:

3.1.5. The Fifth Geometric Arithmetic Entropy of L(S(P(x, y)))

Case 3. When x > 1. The fifth geometric arithmetic entropy can be estimated by using (11) and Table 4 in the following manner; so (11), with Table 4, can be written as:

Case 4. When x = 1. By using Table 5 and (11), we get the closed expressions for the GA5 index and entropy as:

4. Formation from Fusion of Zigzag-Edge Coronoid with Starphene Nanotubes

If a zigzag-edge coronoid is fused with a starphene, then we obtain a composite benzenoid, denoted here by ZCS. The subdivision graph of ZCS and its line graph are illustrated in Figure 3. The order and the size of the line graph of the subdivision graph of ZCS can be read off from Figure 3 [46]. Let G represent the line graph of the subdivision graph of ZCS. The edge partition is determined by the degrees of each edge's terminal vertices; Table 6 illustrates this.

4.1. Entropy Measure for L(S(ZCS))

We will calculate the entropies of L(S(ZCS)) in this section.

4.1.1. Randić Entropy of L(S(ZCS))

For L(S(ZCS)), the Randić index with the help of Table 6 is

Using (3), the Randić entropy is:

By putting α = 1, −1, 1/2, −1/2 in (32), we get the Randić entropies as given below:

4.1.2. The Atom Bond Connectivity Entropy of L(S(ZCS))

The ABC index and entropy measure, with the help of Table 6 and equation (5), are:

4.1.3. The Geometric Arithmetic Entropy of L(S(ZCS))

The GA index and corresponding entropy, with the help of Table 6 and equation (7), are:

4.1.4. The Fourth Atom Bond Connectivity Entropy of L(S(ZCS))

Table 7 shows the edge partition of the graph L(S(ZCS)), which is based on the degree sums of each edge's terminal vertices.

After simple calculations, the ABC4 index and entropy measure, with the help of Table 7 and equation (9), are:

4.1.5. The Fifth Geometric Arithmetic Entropy of L(S(ZCS))

After some simple calculations, the GA5 index and corresponding entropy measure, with the help of Table 7 and equation (11), are:

5. Concluding Remarks for Computed Results

The applications of the information-theoretic framework in many disciplines of study, such as biology, physics, engineering, and the social sciences, have grown exponentially over the past two decades. This phenomenal increase has been particularly impressive in the fields of soft computing, molecular biology, and information technology. As a result, scientists may find our numerical and graphical results useful [54, 55]. The entropy function is monotonic: as the size of a chemical structure increases, so does the entropy measure, and as the entropy of a system increases, so does the uncertainty regarding its reaction.

For L(S(T_p)), the numerical and graphical results are shown in Tables 8 and 9 and Figures 4–7. In Table 9, the fifth geometric arithmetic entropy is zero, which shows that the process is deterministic in that case. When the chemical structure expands, the Randić entropy for one choice of α develops more quickly than the other entropy measurements of L(S(T_p)), whereas for another choice it develops more slowly. This demonstrates that different topologies have varied entropy characteristics. For L(S(P(x, y))), the numerical and graphical results are shown in Tables 10–13 and Figures 8–12. When the chemical structure expands, the geometric arithmetic entropy develops more quickly than the other entropy measurements of L(S(P(x, y))), whereas another of the measures develops more slowly. Finally, for L(S(ZCS)), the numerical and graphical results are shown in Table 14 and Figures 13–16. When the chemical structure expands, the geometric arithmetic entropy develops more quickly than the other entropy measurements of L(S(ZCS)), whereas the Randić entropy develops more slowly.

The novelty of this article is that entropies are computed for three types of benzenoid systems. These entropy measures are useful in estimating the heat of formation and many physico-chemical properties. In the statistical analysis of benzene structures, entropy measures showed more significant results than topological indices. Therefore, we can say that the entropy measure is a newly introduced topological descriptor.

6. Conclusion

Using Shannon's entropy and the entropy definition of Chen et al. [31], we generated graph entropies associated with a new information function in this research. A relationship is created between the indices and the information entropies. Using the line graph of the subdivision of these graphs, we estimated the entropies for the triangular benzenoids T_p, the hexagonal parallelogram nanotubes P(x, y), and the composite benzenoid ZCS. The thermodynamic entropy of enzyme-substrate complexes [57, 58] and the configurational entropy of glass-forming liquids [56] are two examples of thermodynamic entropy employed in molecular dynamics studies of complex chemical systems. Similarly, using information entropy as a crucial structural criterion could be a new step in this direction.

Data Availability

The data used to support the findings of this study are cited at relevant places within the text as references.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authors’ Contributions

All authors contributed equally to this work.