Abstract

Graph theory is a branch of discrete mathematics used to design and analyse networks. Topological invariants are mathematical tools for analysing the connection properties of a particular network. The Cellular Neural Network (CNN) is a computing paradigm in the fields of machine learning and computer science. In this article we give closed expressions for dominating invariants, computed from the dominating degree, for a cellular neural network. Moreover, we also present a 3D comparison between dominating invariants and classical degree-based indices to show that, in some cases, dominating invariants give a better correlation on the cellular neural network than classical indices.

1. Introduction

With advancing technology, computer networks, electrical networks, and some biological networks can send and transfer useful data and information accurately in a very short time. With the rapid growth of networking science, many advanced and densely connected complex interconnection networks have been developed. Social networks, the World Wide Web (WWW), ecological networks, genetic interaction networks, and metabolic networks are examples of such complex advanced networks. In information technology, these powerful, high-capacity frameworks have become a necessity.

The World Wide Web (WWW) is a framework of computer systems and internal networks that employs protocol suites such as the Internet Protocol (IP). It is designed to integrate the business, government, and social structures of local and extended communities, which are interconnected by electronic, optical, and wireless systems. The Internet passes a large amount of information among people through social media, newspapers, electronic mail, and many applications such as Skype and Google Meet [1, 2].

Graph theory has a large number of applications in many areas of science such as engineering, computer science, computer networking, software engineering, and electrical and hardware engineering. An interconnection framework or network with a finite number of nodes (computer systems) and the links (connections) between them can be presented as a finite simple connected graph with a finite number of vertices and edges. Inspired by the ability of topological descriptors to model a chemical structure, many researchers have applied them to the networking sciences [3].

Theoretical approaches of graph theory in the field of cheminformatics, describing topological properties of the oxide and silicate networks, are given in [4–7]. A naturally existing network of germanium phosphide and its topologies is discussed in [8]. A hexagon star network is comparatively described via valency-based topological descriptors in [9]. Ahmad et al. derived some degree-based polynomials for swapped networks in [10]. An embedded form of the benzene ring in a p-type surface is topologically explained in [11].

2. Cellular Neural Network

A Cellular Neural Network or Cellular Nonlinear Network (CNN) is a computing paradigm in computer science and machine learning. It relies heavily on communication between neighbouring units. Solving Partial Differential Equations (PDEs), image processing, analysing 3D surfaces, and reducing problems in geodesic maps and sensory-motor organs are some applications of the CNN. CNN processors are systems of finite, fixed topology, with locally connected, fixed-location, multiple-input, single-output nonlinear processing units. In a CNN processor, each cell (processor) has one output through which it communicates with the other cells. The CNN processor was introduced by Leon Chua and Lin Yang in 1988. In the original Chua–Yang CNN processor (CY-CNN), each cell computed a weighted sum of its different inputs, whereas the output was a piecewise linear function [12].

Topologically, cells can be arranged on an infinite plane or a toroidal space. Some architecture topologies of the CNN are the Multiple-Neighbourhood-Size CNN (MNS-CNN), Multilayer CNN (ML-CNN), and Single-Layer CNN (SL-CNN). Mathematically, the relationship between a cell and its neighbours in the area of influence can be modeled by Dominating Invariants (DIs). A CNN can be presented as a matrix array in which each cell receives feed-forward synapses from the inputs and feed-back synapses from the outputs of its neighbourhood cells. Mathematically,
$$\dot{x}_{i,j} = -x_{i,j} + \sum_{c_{k,l}\in N_r(i,j)} A(i,j;k,l)\, y_{k,l} + \sum_{c_{k,l}\in N_r(i,j)} B(i,j;k,l)\, u_{k,l} + z_{i,j},$$
where $x_{i,j}$, $y_{i,j}$, and $u_{i,j}$ are the state, output, and input of the cell $c_{i,j}$, $A(i,j;k,l)$ and $B(i,j;k,l)$ are the feed-back and feed-forward templates, $z_{i,j}$ is the bias, and $N_r(i,j)$ is the sphere of influence containing the neighbourhood cells.
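To make the role of the templates and the sphere of influence concrete, the following is a minimal numerical sketch of one forward-Euler step of a Chua–Yang-type cell update, assuming a 3x3 sphere of influence and zero boundary conditions; the helper names (`cnn_step`, `output`) and the template values are illustrative assumptions, not part of the original formulation.

```python
import numpy as np
from scipy.signal import convolve2d

def output(x):
    # Piecewise linear output of the CY-CNN: y = 0.5 * (|x + 1| - |x - 1|).
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def cnn_step(x, u, A, B, z, dt=0.05):
    # One Euler step of dx/dt = -x + (A * y) + (B * u) + z, where * denotes
    # summation over the 3x3 sphere of influence of each cell.
    y = output(x)
    feedback = convolve2d(y, A, mode="same", boundary="fill", fillvalue=0.0)
    feedforward = convolve2d(u, B, mode="same", boundary="fill", fillvalue=0.0)
    return x + dt * (-x + feedback + feedforward + z)

# Illustrative run on a random 8 x 8 input with symmetric templates
# (symmetry makes convolution and correlation coincide here).
rng = np.random.default_rng(0)
u = rng.random((8, 8))
x = np.zeros((8, 8))
A = np.array([[0, 0, 0], [0, 2, 0], [0, 0, 0]], dtype=float)
B = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], dtype=float)
for _ in range(200):
    x = cnn_step(x, u, A, B, z=-0.5)
print(output(x))
```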

In the following, we present a CNN in which a cell $c_{i,j}$ is locally coupled to its surrounding cells, where
$$N(c_{i,j}) = \{\, c_{i-1,j-1},\; c_{i-1,j},\; c_{i-1,j+1},\; c_{i,j-1},\; c_{i,j+1},\; c_{i+1,j-1},\; c_{i+1,j},\; c_{i+1,j+1} \,\}.$$

Here, each of these cells is a first-degree neighbour of $c_{i,j}$ [13, 14]. A depiction of the cellular neural network is shown in Figure 1, and its presentation as a simple connected graph on its cells is given in Figure 2.

3. Dominating Degree and the Cellular Neural Network

A CNN can easily be depicted as a simple connected graph with a finite number of cells (vertices) and links (edges) between them, with no multiple edges. In a graph, the number of connections at a cell is called its degree. In the context of graphs, all the vertices joined to a vertex by its incident edges are called its neighbourhood vertices. Excluding the vertex itself and considering its neighbourhood vertices gives the open neighbourhood of the vertex, while including the vertex and counting all the edges adjacent to all vertices in the neighbourhood gives the closed neighbourhood of the vertex. Graphically, a vertex dominates all the edges in its closed neighbourhood. So, here, we define the count of all such edges dominated by a vertex $v$ in its closed neighbourhood as the dominating degree $d_d(v)$ of the vertex. In other words, the total number of distinct edges in the closed neighbourhood of a vertex is its dominating degree. In the previous section, for the CNN, $N(c_{i,j})$ represents the open neighbourhood of a cell $c_{i,j}$, and its members are the neighbouring cells. In the following, we show a CNN highlighting a particular cell, with its open neighbourhood marked by red-coloured edges in Figure 3 and its closed neighbourhood marked by green edges in Figure 4.

Counting the number of connections in the closed neighbourhood of a cell gives its dominating degree. Working in the same way, finding the dominating degree of each cell and generalizing the cellular neural network as a simple connected graph with $m$ cells in each row and $n$ cells in each column, we conclude that it has $mn$ cells in total and, under the eight-neighbour coupling described above, $4mn-3m-3n+2$ connections. On calculating the dominating degree for each cell, we obtain a partition of the connections according to the dominating degrees of their two directly connected end cells; for sufficiently large $m$ and $n$, there are 14 different types of connections. Here, we describe the number of these connections, differentiated on the basis of the dominating degrees of the two directly connected end cells. Say two cells $p$ and $q$ are directly connected; then their dominating degrees are $d_d(p)$ and $d_d(q)$, respectively, and we express the connection between them as the edge $pq$. Therefore, we obtain the edge partition of the CNN based on the dominating degrees of the end cells of each connection.
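As a concrete illustration of this construction, the following sketch builds the CNN as a graph under the eight-neighbour coupling described above, computes the dominating degree of every cell as the number of distinct edges dominated by it, and tallies the connection types by the dominating degrees of their end cells. The helper names `cnn_graph` and `dominating_degree` are illustrative assumptions, not notation from the original article.

```python
from collections import Counter
from itertools import product

def cnn_graph(m, n):
    # Cells are grid positions (i, j); two cells are connected when their row
    # and column indices each differ by at most one (eight-neighbour coupling).
    cells = list(product(range(m), range(n)))
    edges = set()
    for (i, j) in cells:
        for di, dj in product((-1, 0, 1), repeat=2):
            if (di, dj) != (0, 0) and 0 <= i + di < m and 0 <= j + dj < n:
                edges.add(frozenset(((i, j), (i + di, j + dj))))
    return cells, edges

def dominating_degree(v, edges):
    # Closed neighbourhood N[v], then the number of distinct edges having at
    # least one end cell in N[v] (the edges dominated by v).
    closed = {v} | {u for e in edges if v in e for u in e}
    return sum(1 for e in edges if e & closed)

cells, edges = cnn_graph(6, 6)
dd = {v: dominating_degree(v, edges) for v in cells}

# Connection types, keyed by the sorted dominating degrees of the end cells.
partition = Counter(tuple(sorted(dd[v] for v in e)) for e in edges)
print(len(cells), len(edges))       # total cells and total connections
for degrees, count in sorted(partition.items()):
    print(degrees, count)
```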

4. Dominating Invariants

Reti et al. introduced neighbourhood-based versions of the first and second Zagreb invariants, named the neighbourhood first Zagreb index and the neighbourhood second Zagreb index, respectively [15]. Along the same lines, relying on the definition of the dominating degree of a vertex, we define some Dominating Topological Invariants (DTIs). These invariants are computed from the dominating degrees of the nodes of a network.

4.1. Dominating Randić Invariant

Taking any real number $\alpha$, the dominating Randić invariant is computed as follows:
$$R_\alpha^{D}(G) = \sum_{pq \in E(G)} \big(d_d(p)\, d_d(q)\big)^{\alpha}.$$

4.2. Dominating Geometric Invariant

In a simple connected network, the dominating geometric invariant is computed as follows [16]:
$$GA^{D}(G) = \sum_{pq \in E(G)} \frac{2\sqrt{d_d(p)\, d_d(q)}}{d_d(p) + d_d(q)}.$$

4.3. Dominating Atomic Bond Connectivity Invariant

In a simple connected network, the dominating atomic bond connectivity invariant is computed as follows [17]:
$$ABC^{D}(G) = \sum_{pq \in E(G)} \sqrt{\frac{d_d(p) + d_d(q) - 2}{d_d(p)\, d_d(q)}}.$$
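Under the definitions above, these invariants can be evaluated numerically from the dominating degrees of the end cells of every connection. The following sketch assumes the hypothetical `cnn_graph` and `dominating_degree` helpers, and the `edges` and `dd` values, from the earlier sketch, and implements the three invariants as reconstructed here.

```python
from math import sqrt

def dominating_randic(edges, dd, alpha):
    # Dominating Randic invariant: sum over edges of (d_d(p) * d_d(q)) ** alpha.
    return sum((dd[p] * dd[q]) ** alpha for p, q in map(tuple, edges))

def dominating_geometric(edges, dd):
    # Dominating geometric invariant (geometric-arithmetic form).
    return sum(2 * sqrt(dd[p] * dd[q]) / (dd[p] + dd[q]) for p, q in map(tuple, edges))

def dominating_abc(edges, dd):
    # Dominating atomic bond connectivity invariant.
    return sum(sqrt((dd[p] + dd[q] - 2) / (dd[p] * dd[q])) for p, q in map(tuple, edges))

# Evaluated on the 6 x 6 CNN built in the previous sketch.
print(dominating_randic(edges, dd, alpha=-0.5))
print(dominating_geometric(edges, dd))
print(dominating_abc(edges, dd))
```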

5. Main Results

This section includes our main results for the network on dominating invariants. In [18], Imran et al. explained some topological properties of Cellular Neural Networks based on classical degree-based invariants. They derived closed results for the Randić index, geometric index, and atomic bond connectivity index of the CNN. In this article, we discuss some dominating invariants computed from dominating degrees of the cellular neural network and give a 3D comparison between classical degree-based indices and dominating invariants.

5.1. Theorem

Consider a Cellular Neural Network on $m \times n$ cells as a simple connected graph; then its dominating Randić invariant, for real values of $\alpha$, can be expressed in closed form in terms of $m$ and $n$.

Proof. By using the dominating degrees of directly connected cells and the defining formula of the dominating Randić invariant, we substitute the edge partition obtained above for each considered value of $\alpha$; computing and simplifying the resulting sums gives the stated closed expressions.

5.2. Corollary

The dominating Randić invariants for $\alpha = 1$ and $\alpha = -1$ are also known as the dominating second Zagreb and dominating modified second Zagreb invariants, respectively.

5.3. Theorem

Consider a Cellular Neural Network on $m \times n$ cells as a simple connected graph; then its dominating geometric invariant can be expressed in closed form in terms of $m$ and $n$.

Proof. By using the dominating degrees of two directly connected cells and substituting the edge partition into the formula for the dominating geometric invariant, we get the desired result.

5.4. Theorem

Consider a Cellular Neural Network on $m \times n$ cells as a simple connected graph; then its dominating atomic bond connectivity invariant can be expressed in closed form in terms of $m$ and $n$.

Proof. By using the dominating degrees of two directly connected cells and substituting the edge partition into the formula for the dominating atomic bond connectivity invariant, we get the desired result.
In light of the above theorems, we state the following proposition, based on the fact that the dominating degree in a graph is closely related to the closed neighbourhood of a vertex.

5.5. Proposition

Let $G$ be a simple connected graph with finite order and size. If $G$ is free from the triangle $C_3$ as a subgraph, then any network or structure isomorphic to $G$ will give equal topological results for neighbourhood indices and dominating invariants, since in a triangle-free graph the dominating degree of every vertex equals its neighbourhood degree sum.
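A small numerical check of this proposition, under the assumption that the forbidden subgraph is the triangle: on a triangle-free graph such as a rectangular grid without diagonal couplings, the dominating degree of every vertex should coincide with its neighbourhood degree sum. The sketch reuses the hypothetical `dominating_degree` helper from Section 3.

```python
def neighbourhood_degree_sum(v, edges):
    # Sum of the ordinary degrees of the neighbours of v.
    degree = lambda u: sum(1 for e in edges if u in e)
    neighbours = {u for e in edges if v in e for u in e} - {v}
    return sum(degree(u) for u in neighbours)

# A 4 x 4 rectangular grid (no diagonal couplings) contains no triangle.
grid_cells = [(i, j) for i in range(4) for j in range(4)]
grid_edges = {frozenset((a, b)) for a in grid_cells for b in grid_cells
              if abs(a[0] - b[0]) + abs(a[1] - b[1]) == 1}

# On this triangle-free graph, dominating degree == neighbourhood degree sum.
assert all(dominating_degree(v, grid_edges) == neighbourhood_degree_sum(v, grid_edges)
           for v in grid_cells)
```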

6. Graphical Comparison

In this section, we will provide a 3D graphical comparison between dominating invariants and classical degree-based indices.

The graphical comparison shows that, for the values of $\alpha$ considered in Figures 5 and 6, the dominating Randić invariant grows faster than the classical degree-based Randić index and captures the topology of the cellular neural network better, since dominating invariants are computed from the dominating degree of a cell over its neighbouring cells. Similarly, the dominating geometric invariant grows faster than the classical degree-based geometric index, as shown in Figure 7. Figures 8–10 compare the dominating Randić invariant for further values of $\alpha$ and the dominating atomic bond connectivity invariant, respectively.
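A sketch of how such a 3D comparison can be generated is given below: it evaluates the classical and dominating Randić quantities numerically on the CNN for a small range of $m$ and $n$ and draws both surfaces. The helpers `cnn_graph`, `dominating_degree`, and `dominating_randic` are the hypothetical ones from the earlier sketches, and the exponent $\alpha = -1/2$ is chosen only for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

def classical_randic(edges, alpha):
    # Classical degree-based Randic index: sum over edges of (deg(p) * deg(q)) ** alpha.
    degree = lambda u: sum(1 for e in edges if u in e)
    return sum((degree(p) * degree(q)) ** alpha for p, q in map(tuple, edges))

sizes = list(range(4, 10))
M, N = np.meshgrid(sizes, sizes)
Z_classical = np.zeros(M.shape)
Z_dominating = np.zeros(M.shape)
for a in range(M.shape[0]):
    for b in range(M.shape[1]):
        cells, edges = cnn_graph(int(M[a, b]), int(N[a, b]))
        dd = {v: dominating_degree(v, edges) for v in cells}
        Z_classical[a, b] = classical_randic(edges, alpha=-0.5)
        Z_dominating[a, b] = dominating_randic(edges, dd, alpha=-0.5)

# Two translucent surfaces over the (m, n) grid for visual comparison.
ax = plt.figure().add_subplot(projection="3d")
ax.plot_surface(M, N, Z_classical, alpha=0.6)
ax.plot_surface(M, N, Z_dominating, alpha=0.6)
ax.set_xlabel("m"); ax.set_ylabel("n"); ax.set_zlabel("index value")
plt.show()
```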

7. Conclusions

According to the mathematical analysis and the graphical comparison, we conclude that dominating-degree-based invariants have better predictive ability. Our results show that dominating-degree-based invariants grow faster and predict the behaviour of the cellular neural network more effectively than classical degree-based indices. This may help to provide a far better analysis for predicting topological properties of the cellular neural network in the future.

Data Availability

No data were used to support the study.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors extend their appreciation to the Deanship of Scientific Research at Majmaah University for funding this work under project number R-2021-120.