Abstract

A class of interval neural networks with time-varying delays and distributed delays is investigated. By employing H-matrix and M-matrix theory, homeomorphism techniques, the Lyapunov functional method, and the linear matrix inequality approach, sufficient conditions for the existence, uniqueness, and global robust exponential stability of the equilibrium point of the neural networks are established, and some previously published results are improved and generalized. Finally, some numerical examples are given to illustrate the effectiveness of the theoretical results.

1. Introduction

In recent years, great attention has been paid to neural networks due to their applications in many areas such as signal processing, associative memory, pattern recognition, parallel computation, and optimization. It should be pointed out that these successful applications heavily rely on the dynamic behaviors of neural networks. Stability, as one of the most important properties of neural networks, is crucially required when designing them. For example, in order to solve problems in the fields of optimization, neural control, and signal processing, neural networks have to be designed such that there is only one equilibrium point and it is globally asymptotically stable, so as to avoid the risk of spurious equilibria and local minima.

We should point out that neural networks have recently been implemented on electronic chips. In electronic implementations of neural networks, time delays are unavoidably encountered during the processing and transmission of signals, which can cause oscillation and instability of a neural network. On the other hand, there inevitably exist uncertainties caused by modeling errors, external disturbance, and parameter fluctuation, which can lead to complex dynamic behaviors. Thus, a good neural network should be robust against such uncertainties. If the uncertainties of a system are due to the deviations and perturbations of parameters, and if these deviations and perturbations are assumed to be bounded, then the system is called an interval system. Recently, the global robust stability of interval neural networks with time delays has been widely investigated (see [1–22] and references therein). In particular, Faydasicok and Arik [3, 4] proposed two criteria for the global asymptotic robust stability of a class of neural networks with constant delays by utilizing Lyapunov stability theorems and the homeomorphism theorem. The obtained conditions are independent of the time delays and rely only on the network parameters of the neural system. Employing Lyapunov-Krasovskii functionals, Balasubramaniam et al. [10, 11] derived two passivity criteria for interval neural networks with time-varying delays in terms of linear matrix inequalities (LMI), which are dependent on the size of the time delays. In practice, to achieve fast response, it is often expected that the designed neural networks converge fast enough. Thus, it is not only theoretically interesting but also practically important to establish sufficient conditions for the global robust exponential stability of neural networks. In [8], Zhao and Zhu established some sufficient conditions for the global robust exponential stability of interval neural networks with constant delays. In [18], Wang et al. obtained some criteria for the global robust exponential stability of interval Cohen-Grossberg neural networks with time-varying delays using LMI, matrix inequality, matrix norm, and Halanay inequality techniques. In [15–17], employing homeomorphism techniques, the Lyapunov method, H-matrix and M-matrix theory, and the LMI approach, Shao et al. established some sufficient conditions for the existence, uniqueness, and global robust exponential stability of the equilibrium point for interval Hopfield neural networks with time-varying delays. Recently, the stability of neural networks with time-varying delays has been extensively investigated, and various sufficient conditions have been established for global asymptotic and exponential stability in [10–27]. Generally, neural networks have a spatial extent due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths. It is therefore desirable to model them by introducing delays continuously distributed over a certain duration of time, such that the distant past has less influence than the recent behavior of the state (see [28–30]). However, distributed delays were not taken into account in [15–17] and most of the above references. To the best of our knowledge, there are few robust stability results for interval neural networks with both time-varying delays and distributed delays (see [21, 22]).

Motivated by the works of [15–17] and the discussions above, the purpose of this paper is to present some new sufficient conditions for the global robust exponential stability of neural networks with time-varying and distributed delays. The obtained results can be easily checked. Comparisons are made with some previous works through remarks and numerical examples, which show that our results effectively improve and generalize some existing works. The neural network can be described by the following differential equations:

$$\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t - \tau_j(t))) + \sum_{j=1}^{n} d_{ij} \int_{t-\sigma}^{t} k_j(t - s) f_j(x_j(s))\,\mathrm{d}s + u_i, \quad i = 1, \dots, n, \tag{1.1}$$

or equivalently

$$\dot{x}(t) = -Cx(t) + Af(x(t)) + Bf(x(t - \tau(t))) + D\int_{t-\sigma}^{t} K(t - s)f(x(s))\,\mathrm{d}s + u, \tag{1.2}$$

where $x(t) = (x_1(t), \dots, x_n(t))^T$ denotes the state vector associated with the $n$ neurons, $C = \operatorname{diag}(c_1, \dots, c_n)$ is a positive diagonal matrix, and $A = (a_{ij})_{n \times n}$, $B = (b_{ij})_{n \times n}$, and $D = (d_{ij})_{n \times n}$ are the interconnection weight matrix, the time-varying delayed interconnection weight matrix, and the distributed delayed interconnection weight matrix, respectively. $f(x(\cdot)) = (f_1(x_1(\cdot)), \dots, f_n(x_n(\cdot)))^T$, where $f_j$ denotes the activation function; $\tau_j(t)$ denotes the time-varying delay associated with the $j$th neuron; $\sigma > 0$; $k_j$, $j = 1, \dots, n$, represents the delay kernel function, which is a real-valued nonnegative continuous function defined on $[0, +\infty)$ satisfying $\int_0^{\sigma} k_j(s)\,\mathrm{d}s = 1$; $u = (u_1, \dots, u_n)^T$ is the constant input vector. The coefficients $c_i$, $a_{ij}$, $b_{ij}$, and $d_{ij}$ can be intervalized as follows:

$$\begin{aligned} C_I &= \{C = \operatorname{diag}(c_i) : 0 < \underline{c}_i \le c_i \le \overline{c}_i,\ i = 1, \dots, n\}, \\ A_I &= \{A = (a_{ij})_{n \times n} : \underline{a}_{ij} \le a_{ij} \le \overline{a}_{ij},\ i, j = 1, \dots, n\}, \end{aligned} \tag{1.3}$$

where $\underline{C} = \operatorname{diag}(\underline{c}_i)$, $\overline{C} = \operatorname{diag}(\overline{c}_i)$, $\underline{A} = (\underline{a}_{ij})_{n \times n}$, $\overline{A} = (\overline{a}_{ij})_{n \times n}$, and $B_I$, $D_I$ are defined in the same way through $\underline{B}$, $\overline{B}$, $\underline{D}$, $\overline{D}$. Denote $A^* = (\overline{A} + \underline{A})/2$ and $A_* = (\overline{A} - \underline{A})/2$. Clearly, $A_*$ is a nonnegative matrix and the interval matrix $A_I = [A^* - A_*, A^* + A_*]$. Consequently, $|A - A^*| \le A_*$ for every $A \in A_I$. $B^*$, $B_*$ and $D^*$, $D_*$ are defined correspondingly.
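To make the model concrete before stating the assumptions, the following is a minimal Python sketch that integrates system (1.1) by forward Euler for two neurons. All parameters below (C, A, B, D, u, the delay $\tau(t)$, and the uniform kernel) are hypothetical illustrations, not the paper's example data.

```python
import numpy as np

# Hypothetical two-neuron instance of system (1.1); all data are illustrative.
n = 2
C = np.diag([2.0, 2.0])                      # positive diagonal matrix C
A = np.array([[ 0.1, -0.2], [ 0.2,  0.1]])   # interconnection weight matrix A
B = np.array([[ 0.1,  0.1], [-0.1,  0.1]])   # time-varying-delay weight matrix B
D = np.array([[ 0.05, 0.0], [ 0.0,  0.05]])  # distributed-delay weight matrix D
u = np.array([1.0, -1.0])                    # constant input vector u
f = np.tanh                                  # activation satisfying (H1), l_j = 1
sigma = 1.0                                  # distributed-delay window [t - sigma, t]
tau = lambda t: 0.5 + 0.4 * np.sin(t)        # bounded delay, tau'(t) <= 0.4 < 1 (H2)
h, T = 0.001, 10.0                           # Euler step size and time horizon

m = int(round(sigma / h))                    # history length (here sigma >= max tau)
N = int(round(T / h))
x = np.zeros((N + m + 1, n))
x[: m + 1] = np.array([0.5, -0.5])           # constant initial condition on [-sigma, 0]

for k in range(m, m + N):
    t = (k - m) * h
    d = int(round(tau(t) / h))               # discretized time-varying delay
    # Uniform kernel k_j(s) = 1/sigma on [0, sigma]; Riemann sum for the integral term.
    conv = f(x[k - m:k]).sum(axis=0) * (h / sigma)
    x[k + 1] = x[k] + h * (-C @ x[k] + A @ f(x[k]) + B @ f(x[k - d]) + D @ conv + u)

# The state should settle at the unique equilibrium x* solving C x* = (A + B + D) f(x*) + u.
print("state near t = T:", x[-1])
```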

Throughout this paper, we make the following assumptions.

(H1) Each activation function $f_j$ is Lipschitz continuous and monotonically nondecreasing; that is, there exist constants $l_j > 0$ such that
$$0 \le \frac{f_j(s_1) - f_j(s_2)}{s_1 - s_2} \le l_j, \quad \forall s_1, s_2 \in R,\ s_1 \ne s_2,\ j = 1, \dots, n.$$

(H2) The delays $\tau_j(t)$ are bounded differentiable functions of time and satisfy $0 \le \tau_j(t) \le \tau_j^{+}$ and $\dot{\tau}_j(t) \le \mu_j < 1$, $j = 1, \dots, n$.
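For illustration (a routine observation, not taken from the paper), two standard activation choices satisfy (H1) with $l_j = 1$:

$$f_j(s) = \tanh(s), \quad 0 \le f_j'(s) = 1 - \tanh^2(s) \le 1; \qquad f_j(s) = \tfrac{1}{2}\big(|s + 1| - |s - 1|\big), \quad 0 \le \frac{f_j(s_1) - f_j(s_2)}{s_1 - s_2} \le 1.$$

Both functions are monotonically nondecreasing, so the difference quotient in (H1) is indeed nonnegative.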

Denote $L = \operatorname{diag}(l_1, \dots, l_n)$, $\tau = \max_{1 \le j \le n} \tau_j^{+}$, $\mu = \max_{1 \le j \le n} \mu_j$, and $\delta = \max\{\tau, \sigma\}$.

The organization of this paper is as follows. In Section 2, some preliminaries are given. In Section 3, sufficient conditions for the existence, uniqueness, and global robust exponential stability of the equilibrium point for system (1.1) are presented. In Section 4, some numerical examples are provided to illustrate the effectiveness of the obtained results and comparisons are made between our results and the previously published ones. A concluding remark is given in Section 5 to end this work.

2. Preliminaries

We give some preliminaries in this section. For a vector $x = (x_1, \dots, x_n)^T$, $\|x\| = (\sum_{i=1}^{n} x_i^2)^{1/2}$ denotes the Euclidean norm. For a matrix $A$, $A^T$ denotes the transpose; $A^{-1}$ denotes the inverse; $A > 0$ ($A \ge 0$) means that $A$ is a symmetric positive definite (semidefinite) matrix; $\lambda_{\max}(A)$ and $\lambda_{\min}(A)$ denote the largest and the smallest eigenvalues of $A$, respectively; $\|A\|$ denotes the spectral norm of $A$. $I$ denotes the identity matrix. $*$ denotes the symmetric block in a symmetric matrix.

Definition 2.1 (see [20]). The neural network (1.1) with the parameter ranges defined by (1.3) is globally robustly exponentially stable if for each $C \in C_I$, $A \in A_I$, $B \in B_I$, and $D \in D_I$, system (1.1) has a unique equilibrium point $x^*$, and there exist constants $\varepsilon > 0$ and $M > 0$ such that
$$\|x(t) - x^*\| \le M e^{-\varepsilon t} \sup_{-\delta \le s \le 0}\|\phi(s) - x^*\|, \quad t \ge 0,$$
where $x(t)$ is a solution of system (1.1) with the initial value $x(s) = \phi(s)$, $s \in [-\delta, 0]$.

Definition 2.2 (see [31]). Let $Z^{n \times n} = \{A = (a_{ij}) \in R^{n \times n} : a_{ij} \le 0,\ i \ne j\}$, where $R^{n \times n}$ denotes the set of all $n \times n$ matrices with entries from $R$. Then a matrix $A$ is called an M-matrix if $A \in Z^{n \times n}$ and all successive principal minors of $A$ are positive.

Definition 2.3 (see [31]). An $n \times n$ matrix $A = (a_{ij})$ is said to be an H-matrix if its comparison matrix $M(A) = (m_{ij})_{n \times n}$ is an M-matrix, where
$$m_{ii} = |a_{ii}|, \qquad m_{ij} = -|a_{ij}|, \quad i \ne j,\ i, j = 1, \dots, n.$$
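Definitions 2.2 and 2.3 translate directly into a finite numerical test: check the sign pattern and the leading principal minors, and form the comparison matrix. The following Python sketch implements exactly that; the sample matrix at the end is an arbitrary illustration.

```python
import numpy as np

def is_M_matrix(A):
    """Definition 2.2: off-diagonal entries <= 0 and all successive
    (leading) principal minors positive."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    off_diag_ok = all(A[i, j] <= 0 for i in range(n) for j in range(n) if i != j)
    minors_ok = all(np.linalg.det(A[:k, :k]) > 0 for k in range(1, n + 1))
    return off_diag_ok and minors_ok

def comparison_matrix(A):
    """Definition 2.3: m_ii = |a_ii| and m_ij = -|a_ij| for i != j."""
    M = -np.abs(np.asarray(A, dtype=float))
    np.fill_diagonal(M, np.abs(np.diag(A)))
    return M

def is_H_matrix(A):
    return is_M_matrix(comparison_matrix(A))

A = np.array([[3.0, -1.0], [0.5, 2.0]])  # arbitrary sample matrix
print(is_H_matrix(A))  # True: M(A) = [[3, -1], [-0.5, 2]] has positive leading minors
```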

Lemma 2.4 (see [19]). For any vectors $x, y \in R^{n}$ and any positive definite matrix $Q \in R^{n \times n}$, the following inequality holds: $2x^{T}y \le x^{T}Qx + y^{T}Q^{-1}y$.
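For completeness, the inequality follows from expanding a single nonnegative quadratic form:

$$0 \le \big(Q^{1/2}x - Q^{-1/2}y\big)^{T}\big(Q^{1/2}x - Q^{-1/2}y\big) = x^{T}Qx - 2x^{T}y + y^{T}Q^{-1}y,$$

where $Q^{1/2}$ is the symmetric positive definite square root of $Q$.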

Lemma 2.5 (see [31]). Let $A, B \in Z^{n \times n}$. If $A$ is an M-matrix and the elements of the matrices $A$ and $B$ satisfy the inequalities $b_{ij} \ge a_{ij}$, $i, j = 1, \dots, n$, then $B$ is an M-matrix.

Lemma 2.6 (see [31]). The following LMI
$$\begin{pmatrix} Q(x) & S(x) \\ S(x)^{T} & R(x) \end{pmatrix} > 0,$$
where $Q(x) = Q(x)^{T}$ and $R(x) = R(x)^{T}$, is equivalent to $R(x) > 0$ and $Q(x) - S(x)R(x)^{-1}S(x)^{T} > 0$, or to $Q(x) > 0$ and $R(x) - S(x)^{T}Q(x)^{-1}S(x) > 0$.

Lemma 2.7 (see [1]). $H(x) : R^{n} \to R^{n}$ is a homeomorphism if $H$ satisfies the following conditions: (1) $H$ is injective, that is, $H(x) \ne H(y)$ for all $x \ne y$; (2) $H$ is proper, that is, $\|H(x)\| \to \infty$ as $\|x\| \to \infty$.

Lemma 2.8. Suppose that the neural network parameters are defined by (1.3) and that the LMI condition (2.3) holds, where the free weighting matrices in (2.3) are positive diagonal and the remaining blocks are built from the interval bounds as in (2.4)–(2.5). Then, for all $A \in A_I$, $B \in B_I$, and $D \in D_I$, the matrix inequality defined through (2.7) holds.

Proof. Denote the block matrix appearing in (2.5). By Lemma 2.6, condition (2.3) is equivalent to the positive definiteness of an associated matrix whose off-diagonal entries are nonpositive; it follows by Definition 2.2 that this matrix is an M-matrix.
By Lemma 2.6 again, the desired inequality (2.7) is equivalent to the positive definiteness of a certain symmetric matrix, so we need only verify that positive definiteness. Noting the interval bounds $|A - A^*| \le A_*$, $|B - B^*| \le B_*$, and $|D - D^*| \le D_*$, we obtain the elementwise estimates in (2.8).
Denote the comparison matrix of this symmetric matrix as in Definition 2.3. Using the interval bounds, we can obtain (2.10). It follows from (2.8) and (2.10) that the comparison matrix dominates, elementwise in the sense of Lemma 2.5, a matrix already shown to be an M-matrix. From Lemma 2.5, we deduce that the comparison matrix is an M-matrix; that is, the symmetric matrix is an H-matrix with positive diagonal elements. It is well known that a symmetric H-matrix with positive diagonal entries is positive definite, so this matrix is positive definite, which implies that the conclusion holds for all $A \in A_I$, $B \in B_I$, and $D \in D_I$. The proof is complete.

3. Global Robust Exponential Stability

In this section, we will give a new sufficient condition for the existence and uniqueness of the equilibrium point for system (1.1) and analyze the global robust exponential stability of the equilibrium point.

Theorem 3.1. Under assumptions (H1) and (H2), if there exist positive diagonal matrices such that condition (2.3) of Lemma 2.8 holds, or equivalently the matrix inequality defined through (2.7) holds, then system (1.1) is globally robustly exponentially stable.

Proof. We will prove the theorem in two steps.
Step  1. We will prove the existence and uniqueness of the equilibrium point of system (1.1).
Define the map $H(x) = -Cx + (A + B + D)f(x) + u$; since the delay kernels integrate to one, a point $x^*$ is an equilibrium of system (1.1) exactly when $H(x^*) = 0$. We will prove that $H(x)$ is a homeomorphism of $R^n$ onto itself.
First, we prove that $H(x)$ is an injective map on $R^n$. For any $x, y \in R^n$ with $x \ne y$, we have
$$H(x) - H(y) = -C(x - y) + (A + B + D)\big(f(x) - f(y)\big). \tag{3.1}$$
If $f(x) = f(y)$, then $H(x) - H(y) = -C(x - y) \ne 0$ for $x \ne y$. If $f(x) \ne f(y)$, multiplying both sides of (3.1) by a suitable positive-diagonal weighting and utilizing Lemma 2.4, assumptions (H1) and (H2), and the compatibility of the vector 2-norm and the matrix spectral norm, we deduce the estimate (3.2). By Lemma 2.8, the quadratic form in (3.2) is negative when $f(x) \ne f(y)$, which leads to (3.3). Therefore, $H(x) \ne H(y)$ for all $x \ne y$; that is, $H$ is injective.
Next, we prove that $\|H(x)\| \to \infty$ as $\|x\| \to \infty$. Letting $y = 0$ in (3.2), we obtain a lower bound on $\|H(x) - H(0)\|$ in terms of $\|f(x) - f(0)\|$. Since $H(0)$ and $f(0)$ are finite, it is obvious that $\|H(x)\| \to \infty$ as $\|f(x)\| \to \infty$. On the other hand, for unbounded activation functions, (H1) yields that $\|x\| \to \infty$ implies $\|f(x)\| \to \infty$. For bounded activation functions, it is not difficult to derive directly from (3.1) that $\|H(x)\| \to \infty$ as $\|x\| \to \infty$.
By Lemma 2.7, we know that $H(x)$ is a homeomorphism on $R^n$. Thus, system (1.1) has a unique equilibrium point $x^*$.
Step 2. We prove that the unique equilibrium point $x^*$ is globally robustly exponentially stable.
Let $y(t) = x(t) - x^*$; one can transform system (1.1) into the following system:
$$\dot{y}_i(t) = -c_i y_i(t) + \sum_{j=1}^{n} a_{ij} g_j(y_j(t)) + \sum_{j=1}^{n} b_{ij} g_j(y_j(t - \tau_j(t))) + \sum_{j=1}^{n} d_{ij} \int_{t-\sigma}^{t} k_j(t - s) g_j(y_j(s))\,\mathrm{d}s, \quad i = 1, \dots, n, \tag{3.6}$$
or equivalently
$$\dot{y}(t) = -Cy(t) + Ag(y(t)) + Bg(y(t - \tau(t))) + D\int_{t-\sigma}^{t} K(t - s)g(y(s))\,\mathrm{d}s, \tag{3.7}$$
where $g(y(\cdot)) = (g_1(y_1(\cdot)), \dots, g_n(y_n(\cdot)))^T$, $g_j(y_j) = f_j(y_j + x_j^*) - f_j(x_j^*)$, with $g_j(0) = 0$. Note that each $g_j$ satisfies (H1) with the same constants $l_j$.
We define a Lyapunov functional $V(t)$ of Lyapunov-Krasovskii type with an exponential weight $e^{2\varepsilon t}$. Calculating the derivative of $V(t)$ along the trajectories of system (3.7), and bounding the cross terms by Lemma 2.4 together with (H1) and (H2), one can deduce that $\dot{V}(t)$ is dominated by a quadratic form in the current, delayed, and distributed-delay states. By Lemma 2.6, the negative semidefiniteness of this quadratic form is equivalent to the positive definiteness of the matrix defined by (2.5), which holds by Lemma 2.8 for all admissible interval parameters. Choosing the rate $\varepsilon > 0$ small enough, as quantified in (3.14), we can get $\dot{V}(t) \le 0$; consequently, $V(t) \le V(0)$ for all $t \ge 0$.
On the other hand, $V(t)$ is bounded below by a positive multiple of $e^{2\varepsilon t}\|y(t)\|^2$, while $V(0)$ is bounded above by a constant multiple of $\sup_{-\delta \le s \le 0}\|\phi(s) - x^*\|^2$. Hence, there exists $M > 0$ such that $\|x(t) - x^*\| \le M e^{-\varepsilon t}\sup_{-\delta \le s \le 0}\|\phi(s) - x^*\|$; that is, the estimate in Definition 2.1 holds. Thus, system (1.1) is globally robustly exponentially stable. The proof is complete.

Remark 3.2. For the case of infinite distributed delays, that is, letting $\sigma \to +\infty$ in (1.1), assume that the delay kernels satisfy (H3) $\int_0^{+\infty} k_j(s)\,\mathrm{d}s = 1$ and $\int_0^{+\infty} k_j(s)e^{\beta_0 s}\,\mathrm{d}s < +\infty$ for some positive constant $\beta_0$. A typical example of such delay kernels is given by
$$k_j(s) = \frac{\gamma^{m+1} s^{m}}{m!}\, e^{-\gamma s}, \quad s \in [0, +\infty),$$
where $\gamma > 0$ and $m$ is a nonnegative integer; such kernels are called the Gamma Memory Filter in [32]. From assumption (H3), we can choose a constant $\varepsilon \in (0, \beta_0)$ for which the exponentially weighted kernel integrals $\int_0^{+\infty} k_j(s)e^{\varepsilon s}\,\mathrm{d}s$ remain finite. By a similar argument to the proof of Theorem 3.1, under the conditions of Theorem 3.1 and assumption (H3), we can derive the same exponential estimate for the shifted state. Hence, for the case of $\sigma = +\infty$, system (1.1) is also globally robustly exponentially stable.
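As a quick check (standard Gamma-integral computations, under the exponent convention assumed above), the Gamma memory kernel indeed satisfies both parts of (H3) whenever $0 < \beta_0 < \gamma$:

$$\int_0^{+\infty} \frac{\gamma^{m+1} s^{m}}{m!}\, e^{-\gamma s}\,\mathrm{d}s = \frac{\gamma^{m+1}}{m!}\cdot\frac{m!}{\gamma^{m+1}} = 1, \qquad \int_0^{+\infty} \frac{\gamma^{m+1} s^{m}}{m!}\, e^{(\beta_0 - \gamma)s}\,\mathrm{d}s = \left(\frac{\gamma}{\gamma - \beta_0}\right)^{m+1} < +\infty,$$

using $\int_0^{+\infty} s^{m} e^{-a s}\,\mathrm{d}s = m!/a^{m+1}$ for $a > 0$.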

Remark 3.3. Letting one of the positive diagonal matrices in Theorem 3.1 be a positive scalar matrix, we can obtain a robust exponential stability criterion based on LMI.

Remark 3.4. In [8, 13, 15–18], the authors have dealt with the robust exponential stability of neural networks with time-varying delays. However, distributed delays were not taken into account in their models. Therefore, our results in this paper are more general than those reported in [8, 13, 15–18]. It should be noted that the main results in [15] are a special case of Theorem 3.1 when the distributed-delay term vanishes (i.e., $D = 0$). Also, our results generalize some previous ones in [2, 6, 7], as mentioned in [15].

Remark 3.5. In previous works such as [2, 6, 7, 17], the bound $\|A^*\| + \|A_*\|$ is often used as part of the estimate of $\|A\|$ for $A \in A_I$. Considering that $A_*$ is a nonnegative matrix, we develop a new approach based on H-matrix theory. The obtained robust stability criterion is expressed in terms of the matrices $A^*$ and $A_*$ themselves, which can reduce the conservativeness of the robust results to some extent.

4. Numerical Simulations and Comparisons

In what follows, we give some examples to illustrate the results above and make comparisons between our results and the previously published ones.

Example 4.1. Consider system (1.1) with the interval parameters given in this example. It is clear that assumptions (H1) and (H2) hold for the chosen activation functions and delays. Using the optimization toolbox of Matlab and solving the optimization problem (2.3), we can obtain a feasible solution. By Theorem 3.1, system (1.1) with the above parameters is globally robustly exponentially stable. To illustrate the theoretical result, we present a simulation with a particular admissible choice of the parameters. We can find that the neuron state vector converges to the unique equilibrium point (see Figure 1). Further, from (3.14) and (3.18), we can deduce the constants in the exponential estimate and, in particular, the exponential convergence rate.
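The LMI feasibility search invoked above (performed in the paper with Matlab's optimization toolbox) can also be posed with an off-the-shelf semidefinite solver. Since the example's interval data are not reproduced here, the Python sketch below checks a generic diagonal-Lyapunov LMI of the kind mentioned in Remark 3.3 on an arbitrary test matrix; the matrix A, the margin eps, and the problem form are illustrative assumptions, not the paper's condition (2.3).

```python
import cvxpy as cp
import numpy as np

# Generic feasibility test: find a positive diagonal P with A^T P + P A < 0.
A = np.array([[-2.0,  0.5],
              [ 0.3, -1.5]])                # arbitrary Hurwitz test matrix
n = A.shape[0]
eps = 1e-3                                  # strictness margin for the LMIs

p = cp.Variable(n)                          # diagonal entries of P
P = cp.diag(p)
M = A.T @ P + P @ A                         # symmetric because P is diagonal
constraints = [p >= eps,                    # P positive definite
               (M + M.T) / 2 << -eps * np.eye(n)]

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print(prob.status, p.value)                 # 'optimal' means the LMI is feasible
```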

Next, we will compare our results with the robust stability results previously derived in the literature. If $d_{ij} = 0$ and $\tau_j(t) \equiv \tau_j$ is a constant, $i, j = 1, \dots, n$, system (1.1) reduces to the following interval neural networks:
$$\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t - \tau_j)) + u_i, \quad i = 1, \dots, n, \tag{4.5}$$
which was studied in [3, 4, 8]; the main results there are restated as follows.

Theorem 4.2 (see [3]). Let the activation functions satisfy (H1). Then neural network model (4.5) is globally asymptotically robustly stable if the norm condition of [3] holds, where the quantities involved are spectral-norm bounds formed from the interval matrices of (4.5).

Theorem 4.3 (see [4]). For the neural network defined by (4.5), assume that the activation functions satisfy (H1). Then neural network model (4.5) is globally asymptotically robustly stable if the condition derived in [4] holds, where the quantities involved are certain norm estimates formed from the interval bounds of the connection matrices.

Theorem 4.4 (see [8]). Under assumption (H1), if there exists a positive definite diagonal matrix such that the M-matrix-type condition of [8] holds, where the relevant matrix is as defined in Lemma 2.5, then system (4.5) is globally robustly exponentially stable.

Example 4.5. In system (4.5), we choose the interval parameters given in this example. It is clear that (H1) holds. Solving the optimization problem (2.3), we obtain a feasible solution. By Theorem 3.1, system (4.5) is globally robustly exponentially stable. To illustrate the theoretical result, we present a simulation with a particular admissible choice of the parameters. We can find that the neuron state vector converges to the unique equilibrium point (see Figure 2).

Now, applying the result of Theorem 4.2 to this example yields the corresponding norm condition, and a suitable choice of the free parameter ensures the global robust stability of system (4.5).

If we apply the result of Theorem 4.3 to this example and choose the free parameter so that the relevant bound reaches its minimum value, then we can obtain that system (4.5) is globally robustly exponentially stable only for a restricted range of the parameters.

In this example, system (4.5) is globally robustly exponentially stable over a wider range under our result; hence, for the network parameters of this example, the condition derived in Theorem 3.1 is less restrictive than those imposed by Theorems 4.2 and 4.3.

Example 4.6. In system (4.5), we choose the interval parameters given in this example. Using Theorem 4.4, one can compute the corresponding matrix condition; clearly, there do not exist suitable positive constants for which it holds. As a result, Theorem 4.4 cannot be applied to this example.

Solving the optimization problem (2.3), we obtain a feasible solution. By Theorem 3.1, system (4.5) is globally robustly exponentially stable.

5. Conclusion

In this paper, we discussed a class of interval neural networks with time-varying delays and finite as well as infinite distributed delays. By employing H-matrix and M-matrix theory, homeomorphism techniques, the Lyapunov functional method, and the LMI approach, sufficient conditions for the existence, uniqueness, and global robust exponential stability of the equilibrium point of the neural networks were established. It was shown that the obtained results improve and generalize previously published results, which extends the application domain of such neural networks to a larger class of engineering problems. Numerical simulations demonstrated the main results. Finally, to guide readers toward future work on the robust stability of neural networks, we point out that a key factor should be the determination of new upper-bound norms for the intervalized connection matrices. Such new upper-bound estimates might help derive new sufficient conditions for the robust stability of delayed neural networks.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (11071254) and the Science Foundation of Mechanical Engineering College (YJJXM11004).