
Abstract and Applied Analysis

Volume 2012 (2012), Article ID 647231, 18 pages

http://dx.doi.org/10.1155/2012/647231

## Global Robust Exponential Stability Analysis for Interval Neural Networks with Mixed Delays

Institute of Applied Mathematics, Shijiazhuang Mechanical Engineering College, Shijiazhuang 050003, China

Received 17 September 2012; Revised 21 November 2012; Accepted 21 November 2012

Academic Editor: Józef Banaś

Copyright © 2012 Yanke Du and Rui Xu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

A class of interval neural networks with time-varying delays and distributed delays is investigated. By employing *H*-matrix and *M*-matrix theory, homeomorphism techniques, the Lyapunov functional method, and the linear matrix inequality approach, sufficient conditions for the existence, uniqueness, and global robust exponential stability of the equilibrium point of the neural networks are established, and some previously published results are improved and generalized. Finally, some numerical examples are given to illustrate the effectiveness of the theoretical results.

#### 1. Introduction

In recent years, great attention has been paid to neural networks owing to their applications in many areas such as signal processing, associative memory, pattern recognition, parallel computation, and optimization. It should be pointed out that these successful applications rely heavily on the dynamic behaviors of neural networks. Stability, one of the most important properties of neural networks, is crucially required at the design stage. For example, in order to solve problems in the fields of optimization, neural control, and signal processing, neural networks have to be designed so that there is only one equilibrium point and it is globally asymptotically stable, thereby avoiding the risk of spurious equilibria and local minima.

We should point out that neural networks have recently been implemented on electronic chips. In the electronic implementation of neural networks, time delays are unavoidably encountered during the processing and transmission of signals, which can cause oscillation and instability of a neural network. On the other hand, there inevitably exist uncertainties caused by modeling errors, external disturbances, and parameter fluctuations, which can lead to complex dynamic behaviors. Thus, a good neural network should be robust against such uncertainties. If the uncertainties of a system are due to the deviations and perturbations of parameters, and if these deviations and perturbations are assumed to be bounded, then the system is called an interval system. Recently, the global robust stability of interval neural networks with time delays has been widely investigated (see [1–22] and references therein). In particular, Faydasicok and Arik [3, 4] proposed two criteria for the global asymptotic robust stability of a class of neural networks with constant delays by utilizing Lyapunov stability theorems and the homeomorphism theorem. The obtained conditions are independent of the time delays and rely only on the network parameters of the neural system. Employing Lyapunov-Krasovskii functionals, Balasubramaniam et al. [10, 11] derived two passivity criteria for interval neural networks with time-varying delays in terms of linear matrix inequalities (LMIs), which depend on the size of the time delays. In practice, to achieve fast response, it is often expected that the designed neural networks converge fast enough. Thus, it is not only theoretically interesting but also practically important to establish sufficient conditions for the global robust exponential stability of neural networks. In [8], Zhao and Zhu established some sufficient conditions for the global robust exponential stability of interval neural networks with constant delays.
In [18], Wang et al. obtained some criteria for the global robust exponential stability of interval Cohen-Grossberg neural networks with time-varying delays using LMI, matrix inequality, matrix norm, and Halanay inequality techniques. In [15–17], employing homeomorphism techniques, the Lyapunov method, *H*-matrix and *M*-matrix theory, and the LMI approach, Shao et al. established some sufficient conditions for the existence, uniqueness, and global robust exponential stability of the equilibrium point for interval Hopfield neural networks with time-varying delays. Recently, the stability of neural networks with time-varying delays has been extensively investigated, and various sufficient conditions for global asymptotic and exponential stability have been established in [10–27]. Neural networks usually have a spatial extent due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths. It is therefore natural to model them by introducing continuously distributed delays over a certain duration of time, such that the distant past has less influence than the recent behavior of the state (see [28–30]). However, distributed delays were not taken into account in [15–17] and most of the above references. To the best of our knowledge, there are few robust stability results for interval neural networks with both time-varying delays and distributed delays (see [21, 22]).

Motivated by the works of [15–17] and the discussions above, the purpose of this paper is to present some new sufficient conditions for the global robust exponential stability of neural networks with time-varying and distributed delays. The obtained results can be easily checked. Comparisons with some previous works are made through remarks and numerical examples, which show that our results effectively improve and generalize some existing works. The neural network can be described by the following differential equations: or equivalently where denotes the state vector associated with the neurons, is a positive diagonal matrix, and , , and are the interconnection weight matrix, the time-varying delayed interconnection weight matrix, and the distributed delayed interconnection weight matrix, respectively. , and denotes the activation function, denotes the time-varying delay associated with the th neuron, , , represents the delay kernel function, which is a real-valued continuous function defined in satisfying . is the constant input vector. The coefficients , , , and can be intervalised as follows: where , , , , and . Denote and . Clearly, is a nonnegative matrix and the interval matrix . Consequently, , . and are defined correspondingly.

Throughout this paper, we make the following assumptions. (H1) The activation functions are Lipschitz continuous and monotonically nondecreasing; that is, there exist constants such that (H2) The time-varying delays are bounded differentiable functions of time and satisfy .

Denote , , , and .

The organization of this paper is as follows. In Section 2, some preliminaries are given. In Section 3, sufficient conditions for the existence, uniqueness, and global robust exponential stability of the equilibrium point for system (1.1) are presented. In Section 4, some numerical examples are provided to illustrate the effectiveness of the obtained results and comparisons are made between our results and the previously published ones. A concluding remark is given in Section 5 to end this work.

#### 2. Preliminaries

We give some preliminaries in this section. For a vector , . For a matrix , denotes the transpose; denotes the inverse; means that is a symmetric positive definite (semidefinite) matrix; and denote the largest and the smallest eigenvalues of , respectively; denotes the spectral norm of . denotes the identity matrix. denotes the symmetric block in a symmetric matrix.
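As a quick illustration of this notation (not part of the original paper), the eigenvalue extremes and the spectral norm can be computed directly; the matrix `P` below is an arbitrary symmetric positive definite example:

```python
import numpy as np

# Illustrating the notation of Section 2 on an arbitrary symmetric
# positive definite matrix P (this example is not from the paper).
P = np.array([[4.0, 1.0],
              [1.0, 2.0]])
eigs = np.linalg.eigvalsh(P)          # eigenvalues in ascending order
lam_min, lam_max = eigs[0], eigs[-1]  # smallest / largest eigenvalue
spec_norm = np.linalg.norm(P, 2)      # spectral norm = largest singular value
print(lam_min > 0)                    # P is positive definite
print(np.isclose(spec_norm, lam_max)) # for symmetric PD matrices they agree
```

For a symmetric positive definite matrix the spectral norm coincides with the largest eigenvalue, which is why both quantities appear interchangeably in the estimates below.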

*Definition 2.1 (see [20]). *The neural network (1.1) with the parameter ranges defined by (1.3) is globally robustly exponentially stable if for each , , , , and , system (1.1) has a unique equilibrium point , and there exist constants and such that
where is a solution of system (1.1) with the initial value , .

*Definition 2.2 (see [31]).* Let $Z^{n\times n}=\{A=(a_{ij})\in R^{n\times n}: a_{ij}\le 0,\ i\ne j\}$, where $R^{n\times n}$ denotes the set of all $n\times n$ matrices with entries from $R$. Then a matrix $A$ is called an *M*-matrix if $A\in Z^{n\times n}$ and all successive principal minors of $A$ are positive.
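As a concrete illustration (not part of the original paper), Definition 2.2 can be checked numerically: a candidate matrix must have nonpositive off-diagonal entries and positive successive (leading) principal minors. A minimal NumPy sketch:

```python
import numpy as np

def is_m_matrix(a: np.ndarray, tol: float = 1e-12) -> bool:
    """Check Definition 2.2: nonpositive off-diagonal entries and
    positive successive (leading) principal minors."""
    n = a.shape[0]
    off_diag_ok = all(a[i, j] <= tol for i in range(n)
                      for j in range(n) if i != j)
    minors_ok = all(np.linalg.det(a[:k, :k]) > tol for k in range(1, n + 1))
    return off_diag_ok and minors_ok

# A classic M-matrix (diagonally dominant with nonpositive off-diagonal)...
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
# ...and a matrix that fails the test (its determinant is negative).
B = np.array([[1.0, -3.0],
              [-3.0, 1.0]])
print(is_m_matrix(A))  # True
print(is_m_matrix(B))  # False
```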

*Definition 2.3 (see [31]).* An $n\times n$ matrix $A=(a_{ij})$ is said to be an *H*-matrix if its comparison matrix $\langle A\rangle$ is an *M*-matrix, where $\langle A\rangle_{ii}=|a_{ii}|$ and $\langle A\rangle_{ij}=-|a_{ij}|$ for $i\ne j$.
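The displayed formula for the comparison matrix did not survive extraction; in its standard form (assumed here, following Horn and Johnson [31]), it keeps the moduli of the diagonal entries and negates the moduli of the off-diagonal entries. A small sketch under that assumption:

```python
import numpy as np

def comparison_matrix(a: np.ndarray) -> np.ndarray:
    """Build the comparison matrix <A>: |a_ii| on the diagonal and
    -|a_ij| off the diagonal (the standard form, assumed here)."""
    m = -np.abs(a)
    np.fill_diagonal(m, np.abs(np.diag(a)))
    return m

A = np.array([[3.0, 1.0],
              [-1.0, 3.0]])
print(comparison_matrix(A))
```

By Definition 2.3, `A` is an *H*-matrix exactly when `comparison_matrix(A)` passes the M-matrix test of Definition 2.2.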

Lemma 2.4 (see [19]). *For any vectors $x, y\in R^{n}$ and any positive definite matrix $P\in R^{n\times n}$, the following inequality holds: $2x^{T}y\le x^{T}Px+y^{T}P^{-1}y$.*
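The displayed inequality in Lemma 2.4 was lost in extraction; its standard form (assumed here) is $2x^{T}y\le x^{T}Px+y^{T}P^{-1}y$ for positive definite $P$, which follows from expanding $(P^{1/2}x-P^{-1/2}y)^{T}(P^{1/2}x-P^{-1/2}y)\ge 0$. A quick Monte Carlo sanity check under that assumption:

```python
import numpy as np

# Monte Carlo sanity check of the (assumed) bound
# 2 x^T y <= x^T P x + y^T P^{-1} y for positive definite P.
rng = np.random.default_rng(0)
P = np.array([[2.0, 0.5],
              [0.5, 1.0]])            # positive definite (det = 1.75 > 0)
P_inv = np.linalg.inv(P)
worst_gap = np.inf
for _ in range(1000):
    x = rng.standard_normal(2)
    y = rng.standard_normal(2)
    gap = x @ P @ x + y @ P_inv @ y - 2.0 * (x @ y)
    worst_gap = min(worst_gap, gap)
print(worst_gap >= -1e-9)  # the inequality held on every sample
```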

Lemma 2.5 (see [31]). *Let . If is an M-matrix and the elements of matrices , satisfy the inequalities , , then is an M-matrix.*

Lemma 2.6 (see [31]). *The following LMI: $S=\bigl(\begin{smallmatrix} S_{11} & S_{12}\\ S_{12}^{T} & S_{22}\end{smallmatrix}\bigr)<0$, where $S_{11}=S_{11}^{T}$, $S_{22}=S_{22}^{T}$, is equivalent to $S_{22}<0$ and $S_{11}-S_{12}S_{22}^{-1}S_{12}^{T}<0$, or $S_{11}<0$ and $S_{22}-S_{12}^{T}S_{11}^{-1}S_{12}<0$.*

Lemma 2.7 (see [1]). *$H(x)\in C^{0}$ is a homeomorphism of $R^{n}$ if $H$ satisfies the following conditions:* (1) *$H$ is injective, that is, $H(x)\ne H(y)$ for all $x\ne y$;* (2) *$H$ is proper, that is, $\|H(x)\|\to\infty$ as $\|x\|\to\infty$.*

Lemma 2.8. *Suppose that the neural network parameters are defined by (1.3), and**
where , , and are positive diagonal matrices, , with
**
Then, for all , , and , we have
**
where .*

* Proof. *Denote
By Lemma 2.6, is equivalent to
Obviously, , and it follows by Definition 2.2 that is an *M*-matrix.

By Lemma 2.6, is equivalent to . Therefore, we need only to verify that . Noting that and , we have

Denote the comparison matrix of by , where
Considering , we can obtain that
It follows from (2.8) and (2.10) that . From Lemma 2.5, we deduce that is an *M*-matrix, that is, is an *H*-matrix with positive diagonal elements. It is well known that a symmetric *H*-matrix with positive diagonal entries is positive definite, then , which implies that for all , , and . The proof is complete.
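The last step of the proof invokes the classical fact that a symmetric *H*-matrix with positive diagonal entries is positive definite. A numerical illustration (the matrix `S` below is an arbitrary example, not from the paper):

```python
import numpy as np

# A symmetric, strictly diagonally dominant matrix with positive diagonal
# entries is an H-matrix (its comparison matrix is an M-matrix), and is
# therefore positive definite.
S = np.array([[4.0, 1.0, -1.0],
              [1.0, 5.0, 2.0],
              [-1.0, 2.0, 6.0]])
comp = -np.abs(S)
np.fill_diagonal(comp, np.diag(S))    # comparison matrix <S>
minors = [np.linalg.det(comp[:k, :k]) for k in range(1, 4)]
print(all(m > 0 for m in minors))               # <S> is an M-matrix
print(bool(np.all(np.linalg.eigvalsh(S) > 0)))  # so S is positive definite
```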

#### 3. Global Robust Exponential Stability

In this section, we will give a new sufficient condition for the existence and uniqueness of the equilibrium point for system (1.1) and analyze the global robust exponential stability of the equilibrium point.

Theorem 3.1. *Under assumptions (H1) and (H2), if there exist positive diagonal matrices , , and such that , or equivalently , where and are defined by (2.3) and (2.7), respectively, then system (1.1) is globally robustly exponentially stable.*

* Proof. *We will prove the theorem in two steps.*Step 1.* We will prove the existence and uniqueness of the equilibrium point of system (1.1).

Define a map: . We will prove that is a homeomorphism of into itself.

First, we prove that is an injective map on . For , , we have
If , then for . If , multiplying both sides of (3.1) by , and utilizing Lemma 2.4, assumptions (H1), (H2) and the compatibility of vector 2-norm and matrix spectral norm, we deduce that
By Lemma 2.8, we have shown that if , which leads to
Therefore, for all , that is, is injective.

Next, we prove that as . Letting in (3.2), we get
It follows that
which yields
Since and are finite, it is obvious that as . On the other hand, for unbounded activation functions, by (H1), implies . For bounded activation functions, it is not difficult to derive from (3.1) that as .

By Lemma 2.7, we know that is a homeomorphism on . Thus, system (1.1) has a unique equilibrium point .*Step 2.* We prove that the unique equilibrium point is globally robustly exponentially stable.

Let ; one can transform system (1.1) into the following system:
or equivalently
where , , with .

We define a Lyapunov functional: , where
Calculating the derivative of along the trajectories of system (3.7), we obtain that
Therefore, one can deduce that
where
Denote , from Lemma 2.6, is equivalent to , where is defined by (2.5). By Lemma 2.8, we have . Letting , we can derive that
which yields . Choosing with
we can get
Consequently, for all .

On the other hand,
where . Hence, , that is,
Thus, system (1.1) is globally robustly exponentially stable. The proof is complete.

*Remark 3.2. *For the case of infinite distributed delays, that is, letting in (1.1), assume that the delay kernels satisfy (H3) and for some positive constant . A typical example of such delay kernels is given by for , where , , which are called the Gamma Memory Filter in [32]. From assumption (H3), we can choose a constant satisfying the following requirement:
By an argument similar to the proof of Theorem 3.1, under the conditions of Theorem 3.1 and assumption (H3), we can derive that
where , . Hence, for the case of , system (1.1) is also globally robustly exponentially stable.
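The gamma memory filter of [32] has, in its usual form (an assumption here, since the paper's displayed formula did not survive extraction), the kernel $k(s)=\lambda^{r}s^{r-1}e^{-\lambda s}/(r-1)!$, which carries the unit-integral normalization required of the delay kernels. A numerical check of that normalization:

```python
import math
import numpy as np

# Gamma-memory kernel in its usual form (an assumption here; the paper's
# displayed formula was lost in extraction):
#   k(s) = lam**r * s**(r-1) * exp(-lam*s) / (r-1)!,  s >= 0.
def gamma_kernel(s, r=2, lam=1.5):
    return lam**r * s ** (r - 1) * np.exp(-lam * s) / math.factorial(r - 1)

# Numerically confirm the unit-integral normalization required of the
# delay kernels (truncating the tail, which is negligible beyond s = 40).
s = np.linspace(0.0, 40.0, 400_001)
h = s[1] - s[0]
integral = float(np.sum(gamma_kernel(s)) * h)
print(abs(integral - 1.0) < 1e-4)
```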

*Remark 3.3. *Letting be a positive scalar matrix in Theorem 3.1, we can get a robust exponential stability criterion based on LMI.

*Remark 3.4. *In [8, 13, 15–18], the authors have dealt with the robust exponential stability of neural networks with time-varying delays. However, the distributed delays were not taken into account in their models. Therefore, our results in this paper are more general than those reported in [8, 13, 15–18]. It should be noted that the main results in [15] are a special case of Theorem 3.1 when . Also, our results generalize some previous ones in [2, 6, 7], as mentioned in [15].

*Remark 3.5. *In previous works such as [2, 6, 7, 17], is often used as a part to estimate the bounds for . Considering that is a nonnegative matrix, we develop a new approach based on *H*-matrix theory. The obtained robust stability criterion is in terms of the matrices and , which can reduce the conservativeness of the robust results to some extent.

#### 4. Numerical Simulations and Comparisons

In what follows, we give some examples to illustrate the results above and make comparisons between our results and the previously published ones.

*Example 4.1. *Consider system (1.1) with the following parameters:
It is clear that , , , . Using the MATLAB Optimization Toolbox and solving the optimization problem (2.3), we can obtain a feasible solution:
In this case,
By Theorem 3.1, system (1.1) with the above parameters is globally robustly exponentially stable. To illustrate the theoretical result, we present a simulation with the following parameters:
We can find that the neuron vector converges to the unique equilibrium point (see Figure 1). Further, from (3.14) and (3.18), we can deduce that , , and . Thus the exponential convergence rate is .

Next, we compare our results with the robust stability results previously derived in the literature. If , , is a constant, , system (1.1) reduces to the following interval neural network: which was studied in [3, 4, 8]; the main results are restated as follows.

Theorem 4.2 (see [3]). *Let . Then neural network model (4.5) is globally asymptotically robustly stable, if the following condition holds:
**
where , , , , and with .*

Theorem 4.3 (see [4]). *For the neural network defined by (4.5), assume that . Then, neural network model (4.5) is globally asymptotically stable if the following condition holds:
**
where , , ,
**
with , , with , with and . *

Theorem 4.4 (see [8]). *Under assumption (H1), if there exists a positive definite diagonal matrix , such that
**
where is defined in Lemma 2.5, then system (4.5) is globally robustly exponentially stable. *

*Example 4.5. *In system (4.5), we choose
It is clear that . Solving the optimization problem (2.3), we obtain
By Theorem 3.1, system (4.5) is globally robustly exponentially stable. To illustrate the theoretical result, we present a simulation with , , and . We can find that the neuron vector converges to the unique equilibrium point (see Figure 2).

Now, applying the result of Theorem 4.2 to this example yields the following. The choice ensures the global robust stability of system (4.5).

If we apply the result of Theorem 4.3 to this example and choose , which can guarantee reaches the minimum value when , then we have from which we can obtain that system (4.5) is globally robustly exponentially stable for .

In this example, our result shows that system (4.5) is globally robustly exponentially stable for . Hence, for the network parameters of this example, the condition derived in Theorem 3.1 is less restrictive on than those imposed by Theorems 4.2 and 4.3.

*Example 4.6. *In system (4.5), we choose
Using Theorem 4.4, one can obtain
Clearly, there do not exist suitable positive constants and such that . As a result, Theorem 4.4 cannot be applied to this example.

Solving the optimization problem (2.3), we obtain a feasible solution. By Theorem 3.1, system (4.5) is globally robustly exponentially stable.

#### 5. Conclusion

In this paper, we discussed a class of interval neural networks with time-varying delays and both finite and infinite distributed delays. By employing *H*-matrix and *M*-matrix theory, homeomorphism techniques, the Lyapunov functional method, and the LMI approach, sufficient conditions for the existence, uniqueness, and global robust exponential stability of the equilibrium point of the neural networks were established. It was shown that the obtained results improve and generalize previously published results, thereby extending the application domain of neural networks to a larger class of engineering problems. Numerical simulations demonstrated the main results. Finally, to guide readers toward future work on the robust stability of neural networks, we point out that the key factor should be the determination of new upper bound norms for the intervalized connection matrices. Such new upper bound estimates might help derive new sufficient conditions for the robust stability of delayed neural networks.

#### Acknowledgments

This work was supported by the National Natural Science Foundation of China (11071254) and the Science Foundation of Mechanical Engineering College (YJJXM11004).

#### References

- S. Arik, “Global robust stability analysis of neural networks with discrete time delays,” *Chaos, Solitons and Fractals*, vol. 26, no. 5, pp. 1407–1414, 2005.
- T. Ensari and S. Arik, “New results for robust stability of dynamical neural networks with discrete time delays,” *Expert Systems with Applications*, vol. 37, no. 8, pp. 5925–5930, 2010.
- O. Faydasicok and S. Arik, “Further analysis of global robust stability of neural networks with multiple time delays,” *Journal of the Franklin Institute*, vol. 349, no. 3, pp. 813–825, 2012.
- O. Faydasicok and S. Arik, “A new robust stability criterion for dynamical neural networks with multiple time delays,” *Neurocomputing*, vol. 99, no. 1, pp. 290–297, 2013.
- W. Han, Y. Liu, and L. Wang, “Robust exponential stability of Markovian jumping neural networks with mode-dependent delay,” *Communications in Nonlinear Science and Numerical Simulation*, vol. 15, no. 9, pp. 2529–2535, 2010.
- N. Ozcan and S. Arik, “Global robust stability analysis of neural networks with multiple time delays,” *IEEE Transactions on Circuits and Systems I*, vol. 53, no. 1, pp. 166–176, 2006.
- V. Singh, “Improved global robust stability criterion for delayed neural networks,” *Chaos, Solitons and Fractals*, vol. 31, no. 1, pp. 224–229, 2007.
- W. Zhao and Q. Zhu, “New results of global robust exponential stability of neural networks with delays,” *Nonlinear Analysis: Real World Applications*, vol. 11, no. 2, pp. 1190–1197, 2010.
- F. Wang and H. Wu, “Mean square exponential stability and periodic solutions of stochastic interval neural networks with mixed time delays,” *Neurocomputing*, vol. 73, no. 16–18, pp. 3256–3263, 2010.
- P. Balasubramaniam and G. Nagamani, “A delay decomposition approach to delay-dependent passivity analysis for interval neural networks with time-varying delay,” *Neurocomputing*, vol. 74, no. 10, pp. 1646–1653, 2011.
- P. Balasubramaniam, G. Nagamani, and R. Rakkiyappan, “Global passivity analysis of interval neural networks with discrete and distributed delays of neutral type,” *Neural Processing Letters*, vol. 32, no. 2, pp. 109–130, 2010.
- P. Balasubramaniam and M. S. Ali, “Robust exponential stability of uncertain fuzzy Cohen-Grossberg neural networks with time-varying delays,” *Fuzzy Sets and Systems*, vol. 161, no. 4, pp. 608–618, 2010.
- O. M. Kwon, S. M. Lee, and J. H. Park, “Improved delay-dependent exponential stability for uncertain stochastic neural networks with time-varying delays,” *Physics Letters A*, vol. 374, no. 10, pp. 1232–1241, 2010.
- S. Lakshmanan, A. Manivannan, and P. Balasubramaniam, “Delay-distribution-dependent stability criteria for neural networks with time-varying delays,” *Dynamics of Continuous, Discrete and Impulsive Systems A*, vol. 19, no. 1, pp. 1–14, 2012.
- J. L. Shao, T. Z. Huang, and X. P. Wang, “Improved global robust exponential stability criteria for interval neural networks with time-varying delays,” *Expert Systems with Applications*, vol. 38, no. 12, pp. 15587–15593, 2011.
- J. L. Shao, T. Z. Huang, and S. Zhou, “An analysis on global robust exponential stability of neural networks with time-varying delays,” *Neurocomputing*, vol. 72, no. 7–9, pp. 1993–1998, 2009.
- J.-L. Shao, T.-Z. Huang, and S. Zhou, “Some improved criteria for global robust exponential stability of neural networks with time-varying delays,” *Communications in Nonlinear Science and Numerical Simulation*, vol. 15, no. 12, pp. 3782–3794, 2010.
- Z. Wang, H. Zhang, and W. Yu, “Robust stability criteria for interval Cohen-Grossberg neural networks with time varying delay,” *Neurocomputing*, vol. 72, no. 4–6, pp. 1105–1110, 2009.
- H. Zhang, Z. Wang, and D. Liu, “Robust exponential stability of recurrent neural networks with multiple time-varying delays,” *IEEE Transactions on Circuits and Systems II*, vol. 54, no. 8, pp. 730–734, 2007.
- J. Zhang, “Global exponential stability of interval neural networks with variable delays,” *Applied Mathematics Letters*, vol. 19, no. 11, pp. 1222–1227, 2006.
- X. Li, “Global robust stability for stochastic interval neural networks with continuously distributed delays of neutral type,” *Applied Mathematics and Computation*, vol. 215, no. 12, pp. 4370–4384, 2010.
- W. Su and Y. Chen, “Global robust stability criteria of stochastic Cohen-Grossberg neural networks with discrete and distributed time-varying delays,” *Communications in Nonlinear Science and Numerical Simulation*, vol. 14, no. 2, pp. 520–528, 2009.
- H. Liu, Y. Ou, J. Hu, and T. Liu, “Delay-dependent stability analysis for continuous-time BAM neural networks with Markovian jumping parameters,” *Neural Networks*, vol. 23, no. 3, pp. 315–321, 2010.
- J. Pan, X. Liu, and S. Zhong, “Stability criteria for impulsive reaction-diffusion Cohen-Grossberg neural networks with time-varying delays,” *Mathematical and Computer Modelling*, vol. 51, no. 9-10, pp. 1037–1050, 2010.
- J. Tian and S. Zhong, “Improved delay-dependent stability criterion for neural networks with time-varying delay,” *Applied Mathematics and Computation*, vol. 217, no. 24, pp. 10278–10288, 2011.
- H. Wang, Q. Song, and C. Duan, “LMI criteria on exponential stability of BAM neural networks with both time-varying delays and general activation functions,” *Mathematics and Computers in Simulation*, vol. 81, no. 4, pp. 837–850, 2010.
- X. Zhang, S. Wu, and K. Li, “Delay-dependent exponential stability for impulsive Cohen-Grossberg neural networks with time-varying delays and reaction-diffusion terms,” *Communications in Nonlinear Science and Numerical Simulation*, vol. 16, no. 3, pp. 1524–1532, 2011.
- K. Li, “Stability analysis for impulsive Cohen-Grossberg neural networks with time-varying delays and distributed delays,” *Nonlinear Analysis: Real World Applications*, vol. 10, no. 5, pp. 2784–2798, 2009.
- X. Fu and X. Li, “LMI conditions for stability of impulsive stochastic Cohen-Grossberg neural networks with mixed delays,” *Communications in Nonlinear Science and Numerical Simulation*, vol. 16, no. 1, pp. 435–454, 2011.
- B. Zhou, Q. Song, and H. Wang, “Global exponential stability of neural networks with discrete and distributed delays and general activation functions on time scales,” *Neurocomputing*, vol. 74, no. 17, pp. 3142–3150, 2011.
- R. A. Horn and C. R. Johnson, *Topics in Matrix Analysis*, Cambridge University Press, Cambridge, UK, 1991.
- J. C. Principe, J. M. Kuo, and S. Celebi, “An analysis of the gamma memory in dynamic neural networks,” *IEEE Transactions on Neural Networks*, vol. 5, no. 2, pp. 331–337, 1994.