Abstract

This paper deals with the global asymptotic robust stability (GARS) of neural networks (NNs) with constant time delay via the Frobenius norm. The Frobenius norm is used to derive a new sufficient condition for the existence, uniqueness, and GARS of the equilibrium point of the NNs. A suitable Lyapunov functional, together with slope-bounded activation functions, is employed to establish this new sufficient condition. Finally, we present a comparative study of numerical examples that illustrates the advantages of the proposed result over existing GARS results in terms of the network parameters.

1. Introduction

Neural networks (NNs) operate on principles similar to those of the human nervous system. A network consists of a large number of processing units that operate in parallel and are organized in layers. The first layer receives the raw input, much as humans receive raw sensory information. Each subsequent layer receives the output of the layer before it and passes its own output to the next layer, until the final layer produces the network output. Most nodes are interconnected across layers.

NNs have been studied by many researchers because of their applications in different fields. Much of modern technology is based on computational models known as artificial neural networks (ANNs). Nowadays, artificial intelligence plays a central role in the electrical and electronics world, and ANNs are its backbone. In recent years, the role of ANNs has grown due to their applications in various disciplines. Machine learning uses a variety of NN architectures, such as feedforward NNs, radial basis function NNs, multilayer perceptrons, convolutional NNs, recurrent neural networks (RNNs), modular NNs, and sequence-to-sequence models. Moreover, NNs have wide applications in engineering areas [16] such as radar systems, signal classification, 3D reconstruction, face identification, object recognition, medical diagnosis, visualization, machine translation, combinatorial optimization, and signal processing. They also have notable applications in nonengineering areas such as sales forecasting, risk management, and target marketing.

In NNs, time delay plays an important role in applications such as video lip reading and speech recognition. The delay parameter can affect the convergence of solutions of the given system, and convergence of the solutions is precisely what makes the NN stable. The concept of global stability analysis therefore plays an important role in the convergence analysis of NNs. Different kinds of stability, such as global asymptotic robust stability (GARS), exponential stability, and complete stability of NNs, have been studied by many researchers in [7–14]. These stability results for delayed NNs have been obtained using Lyapunov stability theory, linear matrix inequalities, nonsmooth analysis, and M-matrix theory. The GARS analysis of NNs under parameter uncertainties is therefore an important problem, and it has recently been studied extensively in [15–34].

Motivated by the above concepts, the GARS of NNs is investigated in this paper. The objective is to obtain a new sufficient condition for the GARS of the equilibrium point of the delayed neural system using Lyapunov stability theory and the Frobenius norm under parameter uncertainties. The Frobenius norm is always an upper bound for the spectral norm, that is, $\|A\|_2 \le \|A\|_F$. Moreover, the Frobenius norm is much easier to compute than the spectral norm: it requires no eigenvalue computation, only the trace (the sum of the diagonal entries) of $A^TA$. Computing eigenvalues is difficult for higher-dimensional matrices, and some matrices with real entries have complex eigenvalues; using the Frobenius norm avoids such situations. The Frobenius norm is therefore well suited for bounding the connection weight matrices. By utilizing the concept of homeomorphism, we find a new sufficient condition for the existence and uniqueness of the equilibrium point of NNs. Finally, we give a comparative study of numerical examples to illustrate the effectiveness of our results.
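To make this concrete, the following Python snippet computes both norms for a small hypothetical weight matrix (not one from this paper) and checks the inequality $\|A\|_2 \le \|A\|_F$; note that the Frobenius norm needs only the trace of $A^TA$:

```python
import numpy as np

# Hypothetical 3x3 weight matrix (for illustration only; not from the paper).
A = np.array([[0.5, -1.2,  0.3],
              [2.0,  0.1, -0.7],
              [-0.4, 0.9,  1.1]])

# Frobenius norm: sqrt(trace(A^T A)), i.e., the square root of the sum of
# squared entries -- no eigenvalue computation is needed.
fro = np.sqrt(np.trace(A.T @ A))

# Spectral norm: the largest singular value, sqrt(lambda_max(A^T A)).
spec = np.linalg.norm(A, 2)

print(f"||A||_F = {fro:.4f}")   # agrees with np.linalg.norm(A, 'fro')
print(f"||A||_2 = {spec:.4f}")
assert spec <= fro + 1e-12      # the Frobenius norm bounds the spectral norm
```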

We apply the following notations for the norms of vectors and matrices. Let $v = (v_1, v_2, \ldots, v_n)^T \in \mathbb{R}^n$. The most common vector norms are defined as follows: $\|v\|_1 = \sum_{i=1}^{n} |v_i|$, $\|v\|_2 = \sqrt{\sum_{i=1}^{n} v_i^2}$, and $\|v\|_\infty = \max_{1 \le i \le n} |v_i|$. Let $S = (s_{ij})_{n \times n}$. The corresponding matrix norms are $\|S\|_1 = \max_j \sum_{i=1}^{n} |s_{ij}|$, $\|S\|_2 = \sqrt{\lambda_{\max}(S^TS)}$, $\|S\|_\infty = \max_i \sum_{j=1}^{n} |s_{ij}|$, and the Frobenius norm $\|S\|_F = \sqrt{\operatorname{tr}(S^TS)} = \sqrt{\sum_{i=1}^{n}\sum_{j=1}^{n} s_{ij}^2}$. For any vector $v$, $|v|$ is defined as $|v| = (|v_1|, |v_2|, \ldots, |v_n|)^T$. For any matrix $S$ with real entries, $|S|$ is defined as $|S| = (|s_{ij}|)_{n \times n}$. The minimum and maximum eigenvalues of $S$ are denoted by $\lambda_{\min}(S)$ and $\lambda_{\max}(S)$, respectively. $\operatorname{tr}(S)$ denotes the trace of the matrix $S$, that is, the sum of the diagonal entries of $S$. If $S$ is a symmetric matrix and $x^TSx > 0$ ($x^TSx \ge 0$) for any nonzero real vector $x$, then $S$ is said to be a positive definite (positive semidefinite) matrix. Consider two positive definite matrices $P$ and $Q$. Then, $P \ge Q$ means that $x^T(P - Q)x \ge 0$ for any real vector $x$.
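As a quick illustration of this notation, the following sketch evaluates the vector norms, $|v|$, $\lambda_{\min}$, $\lambda_{\max}$, and the ordering $P \ge Q$ on small matrices chosen purely for illustration:

```python
import numpy as np

v = np.array([1.0, -2.0, 3.0])
print(np.abs(v))                          # |v| = (|v_1|, ..., |v_n|)^T
print(np.linalg.norm(v, 1),               # ||v||_1
      np.linalg.norm(v, 2),               # ||v||_2
      np.linalg.norm(v, np.inf))          # ||v||_inf

S = np.array([[2.0, -1.0],
              [-1.0, 2.0]])               # symmetric matrix
lam = np.linalg.eigvalsh(S)               # eigenvalues in ascending order
print(lam[0], lam[-1])                    # lambda_min(S) and lambda_max(S);
                                          # both positive, so S is positive definite

# P >= Q means x^T (P - Q) x >= 0 for all x, i.e., P - Q is positive
# semidefinite; for symmetric matrices this can be checked via eigenvalues.
P = np.diag([3.0, 3.0])
Q = np.array([[1.0, 0.5], [0.5, 1.0]])
print(np.all(np.linalg.eigvalsh(P - Q) >= 0))   # True, hence P >= Q
```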

2. Problem Statement and Fundamentals

In this paper, we consider the following delayed neural network:

$\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t - \tau)) + u_i, \quad i = 1, 2, \ldots, n,$ (1)

where $n$ denotes the total number of neurons and $x_i(t)$ denotes the state of the $i$th neuron at time $t$. $c_i > 0$ represents the charging rate of the $i$th neuron. $A = (a_{ij})$ and $B = (b_{ij})$ are the connection weight matrices without and with time delay, respectively. $f_j(\cdot)$ denotes the activation function, evaluated at times $t$ and $t - \tau$, where $\tau > 0$ denotes the constant time delay. $u_i$ represents the constant external input to the $i$th neuron.

The matrix-vector form of equation (1) is as follows:

$\dot{x}(t) = -Cx(t) + Af(x(t)) + Bf(x(t - \tau)) + u,$ (2)

where $x(t) = (x_1(t), \ldots, x_n(t))^T$, $C = \operatorname{diag}(c_1, \ldots, c_n)$ with $c_i > 0$, $A = (a_{ij})_{n \times n}$, $B = (b_{ij})_{n \times n}$, $u = (u_1, \ldots, u_n)^T$, and $f(x(\cdot)) = (f_1(x_1(\cdot)), \ldots, f_n(x_n(\cdot)))^T$. The initial condition is $x(s) = \phi(s)$ for $s \in [-\tau, 0]$, where $\phi \in C([-\tau, 0], \mathbb{R}^n)$ and $C([-\tau, 0], \mathbb{R}^n)$ denotes the set of all continuous functions from $[-\tau, 0]$ to $\mathbb{R}^n$.
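For readers who want to experiment with model (2), here is a minimal forward-Euler simulation sketch; all parameter values ($C$, $A$, $B$, $u$, $\tau$) and the tanh activation are illustrative assumptions rather than the paper's example data:

```python
import numpy as np

# Hypothetical parameters for a 2-neuron instance of model (2); all values
# are illustrative assumptions, not taken from the paper's examples.
C = np.diag([1.0, 1.0])                    # charging rates c_i
A = np.array([[0.1, -0.2], [0.2, 0.1]])    # weights without delay
B = np.array([[0.1, 0.1], [-0.1, 0.1]])    # weights with delay
u = np.array([0.5, -0.3])                  # constant external input
tau, dt, T = 1.0, 0.01, 20.0               # delay, step size, horizon
f = np.tanh                                # a slope-bounded activation

n_delay, steps = int(tau / dt), int(T / dt)
x = np.zeros((steps + 1, 2))
x[:n_delay + 1] = np.array([0.8, -0.6])    # constant history phi on [-tau, 0]

# Forward-Euler integration of x'(t) = -C x(t) + A f(x(t)) + B f(x(t-tau)) + u.
for k in range(n_delay, steps):
    x[k + 1] = x[k] + dt * (-C @ x[k] + A @ f(x[k]) + B @ f(x[k - n_delay]) + u)

print("state near t = T:", x[-1])          # settles near the equilibrium point
```

A proper DDE solver would give higher accuracy; the sketch only illustrates the role of the constant-delay history buffer.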

The most common approach for handling the delayed neural system under uncertainty is to let the connection weight matrices $A$ and $B$ and the matrix $C$ vary in intervals as follows:

$C_I := \{C = \operatorname{diag}(c_i) : 0 < \underline{C} \le C \le \overline{C}, \text{ i.e., } 0 < \underline{c}_i \le c_i \le \overline{c}_i, \ i = 1, \ldots, n\},$
$A_I := \{A = (a_{ij}) : \underline{A} \le A \le \overline{A}, \text{ i.e., } \underline{a}_{ij} \le a_{ij} \le \overline{a}_{ij}, \ i, j = 1, \ldots, n\},$
$B_I := \{B = (b_{ij}) : \underline{B} \le B \le \overline{B}, \text{ i.e., } \underline{b}_{ij} \le b_{ij} \le \overline{b}_{ij}, \ i, j = 1, \ldots, n\}.$ (3)
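The following sketch illustrates this interval parametrization with hypothetical bounds: any admissible matrix lies entrywise between the lower and upper bound matrices, from which the center matrix $B^* = \frac{1}{2}(\overline{B} + \underline{B})$ and the radius matrix $B_* = \frac{1}{2}(\overline{B} - \underline{B})$ are formed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical interval bounds for the delayed weight matrix B (illustrative).
B_lo = np.array([[-0.2, 0.0], [0.1, -0.3]])   # lower bound (underline B)
B_hi = np.array([[ 0.2, 0.4], [0.5,  0.1]])   # upper bound (overline B)

# Any admissible B lies entrywise between the bounds:
B = B_lo + (B_hi - B_lo) * rng.random(B_lo.shape)
assert np.all(B_lo <= B) and np.all(B <= B_hi)

# Center and radius matrices built from the interval bounds:
B_star = 0.5 * (B_hi + B_lo)    # B^* (center)
B_delta = 0.5 * (B_hi - B_lo)   # B_* (radius)
print(B_star, B_delta, sep="\n")
```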

Assumption 1 (see [24]). The activation functions are assumed to be slope bounded; that is, there exist some positive constants $\ell_i$ such that the following condition holds:

$0 \le \dfrac{f_i(x) - f_i(y)}{x - y} \le \ell_i, \quad \forall x, y \in \mathbb{R}, \ x \ne y, \ i = 1, 2, \ldots, n.$

This class of functions will be denoted by $f \in \mathcal{L}$. The functions of this class are not required to be bounded, differentiable, or monotonically increasing.
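As a sanity check on Assumption 1, the sketch below verifies numerically that the standard choice $f(x) = \tanh(x)$ is slope bounded with $\ell_i = 1$; recall that the class itself is broader, admitting unbounded and non-monotone functions:

```python
import numpy as np

# Numerical check that f(x) = tanh(x) satisfies the slope-bound condition
# 0 <= (f(x) - f(y)) / (x - y) <= l with l = 1. This is just one standard
# member of the class.
rng = np.random.default_rng(1)
x, y = rng.normal(size=100_000), rng.normal(size=100_000)
mask = x != y
slopes = (np.tanh(x[mask]) - np.tanh(y[mask])) / (x[mask] - y[mask])
print(slopes.min(), slopes.max())   # all difference quotients lie in [0, 1]
assert slopes.min() >= 0 and slopes.max() <= 1 + 1e-12
```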

Lemma 1 (see [29]). Let $A \in A_I$ be a matrix defined as in equation (3). Then, for any positive diagonal matrix $P$ and any nonnegative diagonal matrix $D$, the following inequality holds:
where $A^* = \frac{1}{2}(\overline{A} + \underline{A})$ and $A_* = \frac{1}{2}(\overline{A} - \underline{A})$.
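For intuition, the sketch below demonstrates one widely used uniform norm bound for interval matrices, $\|A\|_2 \le \|A^*\|_2 + \|A_*\|_2$; this is offered only as an illustrative bound of the same flavor, not as the exact inequality of Lemma 1:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical interval bounds (illustrative only).
A_lo = np.array([[-1.0, 0.2], [0.0, -0.5]])
A_hi = np.array([[ 0.4, 1.0], [0.8,  0.5]])
A_star = 0.5 * (A_hi + A_lo)    # center matrix A^*
A_delta = 0.5 * (A_hi - A_lo)   # radius matrix A_*

# A uniform spectral-norm bound over the whole interval family:
bound = np.linalg.norm(A_star, 2) + np.linalg.norm(A_delta, 2)

# Empirical check: the bound holds for every sampled matrix in the interval.
for _ in range(1000):
    A = A_lo + (A_hi - A_lo) * rng.random(A_lo.shape)
    assert np.linalg.norm(A, 2) <= bound + 1e-12
print("uniform bound on ||A||_2:", round(bound, 4))
```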

Lemma 2 (see [24]). If $H(x) \in C^0(\mathbb{R}^n)$ (where $C^0(\mathbb{R}^n)$ denotes the set of continuous functions on $\mathbb{R}^n$) satisfies the following conditions, then $H(x)$ is a homeomorphism on $\mathbb{R}^n$:
(i) $H(x)$ is an injective mapping on $\mathbb{R}^n$;
(ii) $\|H(x)\| \to \infty$ as $\|x\| \to \infty$.

3. Existence and Uniqueness of Equilibrium Point

This section establishes a new sufficient condition for the existence and uniqueness of the equilibrium point of model (2). The condition is derived by means of the Frobenius norm.

Theorem 1. Suppose that $f \in \mathcal{L}$ and there exist a positive diagonal matrix $P$ and a nonnegative diagonal matrix $D$ such that
where $B^* = \frac{1}{2}(\overline{B} + \underline{B})$ and $B_* = \frac{1}{2}(\overline{B} - \underline{B})$. Then, for each constant input vector $u$, the neural network model (2) satisfying (3) has a unique equilibrium point.

Proof. Define the map $G(x) = -Cx + Af(x) + Bf(x) + u$. Then $G(x) = 0$ is equivalent to the equilibrium equation of system (2), so every solution of $G(x) = 0$ is an equilibrium point of system (2). To prove the theorem, it is enough to show that $G$ is a homeomorphism on $\mathbb{R}^n$.
Let $x, y \in \mathbb{R}^n$ be two vectors such that $x \ne y$.
Then,
If , then or . First, let us consider and . From equation (8), we have
since and . Therefore, from equation (9), . Next, let us consider and , and multiply (9) by ; we get
Then, we get
From Lemma 1, we get
Applying the results (11)–(13) in (10), we get
Given that , we have
where is the smallest eigenvalue of the positive definite matrix . If and , then it follows from (14) that .
Therefore, for all . Hence, $G$ is injective on $\mathbb{R}^n$. Now, we prove that $\|G(x)\| \to \infty$ as $\|x\| \to \infty$. For this, let in (15), which implies that
From the above inequality, we write
By applying the properties of norms to the above inequalities, we have
Using the above inequalities in (20), we have the following inequality:
where , , and are finite. Moreover, $\|G(x)\| \to \infty$ as $\|x\| \to \infty$. Therefore, by Lemma 2, for each constant vector $u$, the neural network model (2) has a unique equilibrium point. This completes the proof.

4. Global Stability Analysis

In this section, we prove that the sufficient conditions obtained for the existence and uniqueness of the equilibrium point in Theorem 1 also guarantee the GARS of the neural system (2). We denote the equilibrium point of (1) by $x^*$ and apply the transformation $z(\cdot) = x(\cdot) - x^*$. Under this transformation, the network model (1) takes the following form:

$\dot{z}_i(t) = -c_i z_i(t) + \sum_{j=1}^{n} a_{ij} g_j(z_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(z_j(t - \tau)), \quad i = 1, 2, \ldots, n,$ (20)

where $g_j(z_j(\cdot)) = f_j(z_j(\cdot) + x_j^*) - f_j(x_j^*)$. Moreover, Assumption 1 holds for the functions $g_j$; that is, $f \in \mathcal{L}$ implies $g \in \mathcal{L}$, with $g_j(0) = 0$. Under this transformation, the equilibrium point $x^*$ of (2) is shifted to the origin of (20).

Thus, our focus is to establish the GARS of the origin of the transformed model (20) instead of the GARS of the equilibrium point $x^*$ of (2).

The matrix-vector form of (20) is as follows:

$\dot{z}(t) = -Cz(t) + Ag(z(t)) + Bg(z(t - \tau)),$ (21)

where $z(t) = (z_1(t), \ldots, z_n(t))^T$ is the new state vector, $g(z(\cdot)) = (g_1(z_1(\cdot)), \ldots, g_n(z_n(\cdot)))^T$, and $z(t - \tau) = (z_1(t - \tau), \ldots, z_n(t - \tau))^T$.
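To see the transformation at work, the following sketch uses a hypothetical equilibrium $x^*$ and $f = \tanh$ to check that $g(z) = f(z + x^*) - f(x^*)$ vanishes at the origin and inherits the slope bound of $f$:

```python
import numpy as np

f = np.tanh                               # representative activation
x_star = np.array([0.3, -0.7])            # hypothetical equilibrium point

def g(z):
    # Shifted nonlinearity: g(z) = f(z + x*) - f(x*).
    return f(z + x_star) - f(x_star)

print(g(np.zeros(2)))                     # [0, 0]: the origin is a fixed point

# g inherits the slope bound of f (here l = 1 for tanh):
for z1 in np.linspace(-2.0, 2.0, 9):
    z = np.full(2, z1)
    s = g(z) / np.where(z != 0, z, 1.0)   # elementwise difference quotients
    assert np.all((s >= 0) & (s <= 1.0))
```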

Theorem 2. Suppose that $f \in \mathcal{L}$ and there exist a positive diagonal matrix $P$ and a nonnegative diagonal matrix $D$ such that
where $B^* = \frac{1}{2}(\overline{B} + \underline{B})$ and $B_* = \frac{1}{2}(\overline{B} - \underline{B})$. Then, the origin of the neural network model (21) with network parameters satisfying (3) is globally asymptotically robust stable.

Proof. Consider the following positive definite Lyapunov functional:
where , , , and are positive constants to be determined later. The time derivative of equation (23) along the trajectories of model (21) is
Also,
By applying equations (25)–(28) in (24), we have
Since .
Therefore,
By taking and , can be written in the following form:
Using the result of Lemma 1, we write
Then, becomes
Since , it follows from (33) that
If we take , then it follows that , since implies that . Also, if and , then can be written in the following form:
Since , we have . Therefore, , . Finally, let us assume that and . Then, .
It is obvious that , . Hence, if and only if ; otherwise, . Moreover, $V$ is radially unbounded, since $V(z(t)) \to \infty$ as $\|z(t)\| \to \infty$. Hence, the origin of system (21), or equivalently the equilibrium point of the neural system (2), is GARS. This completes the proof.

5. Comparisons with Numerical Examples

In this section, we compare our sufficient condition for GARS with existing GARS results. For the comparison, the existing results are restated as follows.

Theorem 3 (see [24]). Suppose that $f \in \mathcal{L}$ and there exist matrices $P$ and $D$ such that

Then, the origin of system (21) with network parameters satisfying (3) is globally asymptotically robust stable.

Theorem 4 (see [15]). Suppose that $f \in \mathcal{L}$ and there exist matrices $P$ and $D$ such that

Then, the origin of system (21) with network parameters satisfying (3) is globally asymptotically robust stable, where with and , for .

Now we demonstrate the advantages of our result with some examples.

Example 1. Consider the following network parameters of the neural network model (2):
Let and . From the above matrices, we get
Using the above parameters, .
In this example, we compare our sufficient condition with the existing result by taking $P$ as an identity matrix. Now, are calculated as follows:
, provided . For the sufficient condition , system (2) becomes GARS whenever . For calculating , we need the matrix , which is calculated using the matrices , , and . The matrix is given as follows:
, provided . For the sufficient condition , system (2) becomes GARS whenever . Consider the fixed network parameters , where is a constant, and the activation function . The state trajectories are depicted in Figures 1 and 2 under the initial state and for different initial states, respectively. For the activation function , the state trajectories are depicted in Figures 3 and 4 under the initial state and for different initial states, respectively.
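The comparison workflow of this example can be automated; the hedged sketch below uses hypothetical interval bounds (not the matrices of this example, which are not reproduced here) to compute the norm quantities involved and the largest delayed-weight scaling for which an illustrative condition of this general form still holds:

```python
import numpy as np

# Hypothetical interval bounds for the delayed weight matrix (illustrative
# stand-ins; the example's actual matrices are not used here).
B_lo = np.array([[-0.2, -0.4], [-0.4, -0.2]])
B_hi = np.array([[ 0.4,  0.4], [ 0.4,  0.4]])
B_star = 0.5 * (B_hi + B_lo)            # center matrix
B_delta = 0.5 * (B_hi - B_lo)           # radius matrix

c_min = 2.0                             # assumed smallest charging rate
ell = 1.0                               # assumed slope bound of activations

# Uniform Frobenius-norm bound over the interval (entrywise, so the
# triangle inequality applies): ||B||_F <= ||B^*||_F + ||B_*||_F.
fro_bound = np.linalg.norm(B_star, "fro") + np.linalg.norm(B_delta, "fro")

# Illustrative test inequality of the form c_min > ell * r * (norm bound),
# where r scales the delayed weights; the admissible range of r is then:
r_max = c_min / (ell * fro_bound)
print(f"illustrative condition holds for scalings r < {r_max:.3f}")
```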

Remark 1. From Example 1 above, is valid for . Moreover, our result is valid in the range , but does not hold. Therefore, we conclude that is less conservative than for the network parameters of this example.

Example 2. Consider the neural network model (1) with the following network parameters:
Let and . From the above matrices, we get
Using the above parameters, . In this example, we compare our sufficient condition with the existing result by taking $P$ as an identity matrix. Now, are calculated as follows:
, provided . For the sufficient condition , system (2) becomes GARS whenever . Now, is calculated as follows:
, provided . For the sufficient condition , system (2) becomes GARS whenever . Consider the fixed network parameters , where is a constant, and the activation function . The state trajectories are depicted in Figures 5 and 6 under the initial state and for different initial states, respectively. For the activation function , the state trajectories are depicted in Figures 7 and 8 under the initial state and for different initial states, respectively.

Remark 2. From Example 2, is valid for . Moreover, our result is valid in the range , but does not hold. Therefore, we conclude that is less conservative than for the network parameters of this example.
From the above examples, our sufficient condition is less conservative than those imposed by the results and . Hence, our sufficient condition is more advantageous than the previous results for the above network parameters. Our condition may be less advantageous than existing stability conditions for other sets of network parameters; however, all such results provide only sufficient conditions.

6. Conclusion

In this paper, the global stability of NNs under parameter uncertainties has been studied using the Frobenius norm. The Frobenius norm is easier to compute than the spectral norm. Using it, we have obtained a new sufficient condition for the GARS of the NN model (2). Finally, we discussed numerical examples to illustrate the effectiveness of our result in comparison with previous results.

Data Availability

No data were used to support this study.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors acknowledge King Mongkut’s University of Technology Thonburi (KMUTT) and Agricultural Research Development Agency (ARDA) (PRP6205030870) for their support. The authors are very grateful to Thailand Science Research and Innovation (TSRI) (RDG6230004) for its support.