
Advances in Mathematical Physics

Volume 2013 (2013), Article ID 732406, 6 pages

http://dx.doi.org/10.1155/2013/732406

## LMI-Based Stability Criteria for Discrete-Time Neural Networks with Multiple Delays

Hui Xu and Ranchao Wu

^{1}School of Mathematics, Anhui University, Hefei 230039, China

^{2}Department of Public Teaching, Anhui Business Vocational College, Hefei 230041, China

Received 17 March 2013; Accepted 26 May 2013

Academic Editor: Wen Xiu Ma

Copyright © 2013 Hui Xu and Ranchao Wu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

Discrete neural models are of great importance in numerical simulations and practical implementations. In the current paper, a discrete model of continuous-time neural networks with variable and distributed delays is investigated. By Lyapunov stability theory and techniques such as linear matrix inequalities (LMIs), sufficient conditions guaranteeing the existence and global exponential stability of the unique equilibrium point are obtained. The LMI formulation makes it possible to take the signs of the connection weights into account. To show the effectiveness of the method, an illustrative example, along with a numerical simulation, is presented.

#### 1. Introduction

During the past decades, various types of neural networks have been proposed and investigated intensively, since they play important roles and have found successful applications in fields such as pattern recognition, signal and image processing, nonlinear optimization, and parallel computation. The dynamical behaviors of neural models, such as the existence and asymptotic stability of equilibria, periodic solutions, bifurcations, and chaos, have been among the most active areas of research and have been extensively explored in past years [1–22].

Due to the finite transmission speed of signals among neurons, time delays frequently occur in interactions between neurons and can cause complex dynamics in neural networks [6], so it is necessary to introduce time delays into neural models. So far, discrete, time-varying, and distributed delays have been introduced, respectively, to describe the dynamics of neural networks, and various sufficient conditions ensuring stability have been given.

Note that in numerical simulation and practical implementation, discretization of continuous-time models is necessary and of great importance. On the other hand, the dynamics of discrete-time neural networks can be quite different from those of their continuous counterparts and may display much more complicated behavior. It is therefore of great theoretical and practical significance to study the dynamics of discrete neural models. For discrete models, such as discrete Hopfield, bidirectional associative memory, and Cohen-Grossberg neural networks, several authors [1, 7–22] have studied the existence and exponential stability of equilibria and periodic solutions.
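To make the discretization step above concrete, the following is a minimal sketch of a forward-Euler discretization of a continuous-time Hopfield-type network dx/dt = -Ax + Wf(x) + I; all parameter values here are assumed for illustration and are not taken from the paper.

```python
import numpy as np

def euler_step(x, A, W, I, h, f=np.tanh):
    """One forward-Euler step of dx/dt = -A x + W f(x) + I."""
    return x + h * (-A @ x + W @ f(x) + I)

# Illustrative two-neuron network; all parameter values are assumed,
# not taken from the paper.
A = np.diag([1.0, 1.0])                  # self-decay rates
W = np.array([[0.1, -0.2], [0.2, 0.1]])  # connection weights
I = np.array([0.5, -0.5])                # external inputs
h = 0.1                                  # discretization step size

x = np.zeros(2)
for _ in range(2000):
    x = euler_step(x, A, W, I, h)
# x has now settled close to an equilibrium of the continuous model
```

With a small enough step size the iteration inherits the stability of the continuous flow, which is why the discrete model's behavior can still diverge from the continuous one when the step is not small.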

In this paper, a discrete model with both variable and distributed time delays is introduced. By Lyapunov stability theory and the linear matrix inequality (LMI) technique, sufficient conditions ensuring the existence and global exponential stability of a unique equilibrium point are obtained. To show the effectiveness of our results, an illustrative example along with a numerical simulation is presented. To the best of our knowledge, such general models have seldom been touched upon in the existing literature. Moreover, the obtained conditions are easy to verify. Furthermore, the introduction of LMIs enables us to take the signs of the connection weights into consideration. In contrast, sufficient conditions such as those in [7–12] depend on the absolute values of the connection weights, which ignores the differences between excitatory and inhibitory effects of neurons.

#### 2. Preliminaries

Set to be the set of integers and the set of nonnegative integers; let represent the set of integers between and with , namely, .

Consider the discrete-time neural networks with both variable and distributed delays: with initial values where are the states of the th neuron at time ; represents the rate with which the th neuron resets its potential when isolated from others; and weigh the strengths of the th unit on the th unit; and are the nonlinear activation functions of the neurons; denotes the transmission delay along the axon of the th unit; is the delay kernel; is the external input on the th neuron at time ; the initial value functions are bounded .
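The mathematical symbols of the model are missing from this copy, but a recursion of the general shape described above (state decay, variably delayed term, kernel-weighted distributed term, external input) can be sketched as follows; the model form and every parameter value are assumptions for illustration, not the paper's system (1).

```python
import numpy as np

def simulate(a, B, C, kernel, tau, I, hist, steps, f=np.tanh, g=np.tanh):
    """Iterate the assumed recursion
       x(k+1) = a*x(k) + B f(x(k - tau)) + C * sum_m kernel[m] g(x(k - m)) + I,
    where `hist` holds enough past states to cover both delays."""
    hist = [np.asarray(v, dtype=float) for v in hist]
    for _ in range(steps):
        xd = hist[-1 - tau]                       # variably delayed state
        dist = sum(kernel[m] * g(hist[-1 - m])    # distributed-delay term
                   for m in range(len(kernel)))
        hist.append(a * hist[-1] + B @ f(xd) + C @ dist + I)
    return hist

# Illustrative parameters (assumed, not the paper's):
a = np.array([0.4, 0.5])                  # self-feedback rates, |a_i| < 1
B = np.array([[0.1, -0.1], [0.1, 0.1]])   # delayed connection weights
C = np.array([[0.05, 0.0], [0.0, 0.05]])  # distributed-delay weights
kernel = [0.5, 0.3, 0.2]                  # summable delay kernel
I = np.array([0.2, -0.1])                 # external inputs
traj = simulate(a, B, C, kernel, tau=2, I=I,
                hist=[np.zeros(2)] * 3, steps=500)
```

The bounded initial history plays the role of the initial value functions in (2): every delayed lookup must land inside states that are already defined.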

To investigate stability of system (1), make further assumptions:(H1) suppose that , , for , and for , with being constant;(H2) suppose that , , and , for some , all ;(H3) assume that functions and are bounded and satisfy for any , where , and are some constants and can be positive, negative, or zero, . So they are less restrictive than sigmoid activation functions and Lipschitz-type ones.
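A sector condition of the (H3) type can be checked numerically. The sketch below verifies that f = tanh keeps its difference quotients inside the interval [0, 1], a standard pair of sector constants for this activation; the choice of tanh and of the sampling range are assumptions for illustration.

```python
import numpy as np

# Numerical check (illustrative) that f = tanh satisfies an (H3)-type sector
# condition: the difference quotient (f(u) - f(v)) / (u - v) stays within
# the constants [sigma_minus, sigma_plus] = [0, 1] for all u != v.
rng = np.random.default_rng(0)
u = rng.uniform(-5.0, 5.0, 10000)
v = rng.uniform(-5.0, 5.0, 10000)
keep = np.abs(u - v) > 1e-3              # avoid dividing by near-zero gaps
q = (np.tanh(u[keep]) - np.tanh(v[keep])) / (u[keep] - v[keep])
q_min, q_max = q.min(), q.max()          # both should lie in [0, 1]
```

Because the sector constants are allowed to be negative in (H3), the condition indeed covers a wider class than sigmoid or Lipschitz activations, as the text notes.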

For any , a solution of systems (1) and (2) is a vector-valued function satisfying system (1) and the initial conditions (2) for . In this paper, it is always assumed that neural model (1) admits a solution represented by or simply . Since the activation functions are bounded, it is not difficult to check that system (1) has at least one equilibrium point by Brouwer's fixed point theorem. So without loss of generality, assume that ; that is, is an equilibrium point. Throughout the paper, denote System (1) can be rewritten into the form
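The equilibrium whose existence Brouwer's theorem guarantees can also be located numerically by fixed-point iteration. The sketch below does this for an assumed model form x = a*x + B tanh(x) + I (not the paper's exact system), rearranged into a contraction for the chosen parameter values.

```python
import numpy as np

# Sketch of locating an equilibrium numerically under an assumed model form
# x = a*x + B tanh(x) + I: rewrite as the fixed-point map
# x = (B tanh(x) + I) / (1 - a) and iterate (a contraction for these values).
a = np.array([0.4, 0.6])                  # assumed rates, |a_i| < 1
B = np.array([[0.2, -0.1], [0.1, 0.2]])   # assumed weights
I = np.array([0.3, -0.2])                 # assumed inputs

x = np.zeros(2)
for _ in range(300):
    x = (B @ np.tanh(x) + I) / (1.0 - a)
# x now approximates the equilibrium x* of the assumed system
```

Note that Brouwer's theorem only gives existence; it is the exponential stability result of Section 3 that delivers uniqueness.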

*Definition 1. *The equilibrium point of system (1) is globally exponentially stable if there exist constants and such that for any solution of system (1) with initial conditions , it holds that

#### 3. Exponential Stability of Equilibrium Points

By Lyapunov stability theory and LMI technique, the global exponential stability of the equilibrium point is established. Clearly, if is exponentially stable, the equilibrium point is unique. Now we will investigate the exponential stability of the origin.

Theorem 2. *Suppose that (H1)–(H3) hold and, further, that there exist a number , positive definite matrices , , and positive semidefinite diagonal matrices , , and , such that
**
where
**
Then the origin of system (1) is exponentially stable.*
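The matrices of the theorem's LMI are elided in this copy, so it cannot be reproduced here. As a simplified, delay-free analogue of the same mechanism, a positive definite P certifying exponential stability of x(k+1) = Ax(k) can be computed from the discrete Lyapunov equation; the matrices below are assumed examples, not the paper's.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Delay-free analogue (illustrative, not the paper's LMI): for a Schur-stable
# A, find P > 0 with A^T P A - P = -Q; then V(x) = x^T P x strictly decreases
# along trajectories of x(k+1) = A x(k), the discrete Lyapunov argument the
# proof builds on.
A = np.array([[0.5, 0.1], [-0.2, 0.4]])  # assumed; spectral radius < 1
Q = np.eye(2)
P = solve_discrete_lyapunov(A.T, Q)      # solves A^T P A - P + Q = 0
```

The full theorem replaces this single Lyapunov equation with an LMI in several unknown matrices so that the delayed and distributed terms can be absorbed into the decrease of the Lyapunov functional.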

*Proof. *Define a Lyapunov functional as follows:
where
To investigate the exponential stability of the origin, it is necessary to calculate the difference along the trajectory of (6). From (6), we have
where . Since , one obtains
Therefore, we have

From (H3), one has
Then for , , and , one has
where . From , it follows that . Note that
where are the minimum and maximum eigenvalues of matrix , respectively, , , and , so one has
where . This implies that the equilibrium solution of system (1) is globally exponentially stable. The proof is completed.

*Remark 3. *By employing LMI (8), the signs of , , that is, the differences between excitatory and inhibitory interactions among neurons, are taken into consideration.

*Remark 4. *If , the equilibrium point of system (1) is said to be globally stable.

#### 4. Numerical Example

Next, an illustrative example is given to show the effectiveness of the obtained results. Consider the discrete-time neural model (6) with parameters: It is then not difficult to see that , , , , and . Take ; then, by solving LMI (8), one obtains the feasible solutions and . So, from Theorem 2, this system admits a unique equilibrium , and all other solutions converge to it exponentially; see Figures 1 and 2.
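The example's parameter values are elided in this copy, so the following sketch uses assumed parameters instead to illustrate what Figures 1 and 2 depict: under global exponential stability, trajectories started from different initial histories collapse onto the same equilibrium.

```python
import numpy as np

# Illustrative check of global exponential stability (parameters assumed, not
# those of the paper's example): run the assumed delayed recursion
# x(k+1) = a*x(k) + B tanh(x(k - 2)) + I from two different constant initial
# histories; both trajectories should settle on the same unique equilibrium.
a, tau = 0.5, 2
B = np.array([[0.2, -0.1], [0.1, 0.1]])
I = np.array([0.4, -0.3])

def run(x0, steps=400):
    hist = [np.array(x0, dtype=float)] * (tau + 1)  # constant initial history
    for _ in range(steps):
        hist.append(a * hist[-1] + B @ np.tanh(hist[-1 - tau]) + I)
    return hist[-1]

x_a = run([3.0, -3.0])
x_b = run([-2.0, 5.0])
# x_a and x_b agree to high precision, consistent with a unique,
# globally exponentially stable equilibrium
```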

#### 5. Conclusions

In the current paper, a class of discrete-time neural networks with both variable and distributed delays has been studied. Using Lyapunov stability theory and the LMI technique, the existence and global exponential stability of the unique equilibrium point have been established. The obtained results are easy to verify and should thus be of practical use in applications of discrete neural models.

#### Acknowledgments

This study was supported by the Specialized Research Fund for the Doctoral Program of Higher Education of China (no. 20093401120001), the Natural Science Foundation of Anhui Province (no. 11040606M12), the Natural Science Foundation of Anhui Education Bureau (no. KJ2010A035), and the 211 Project of Anhui University (no. KJJQ1102).

#### References

- S. Mohamad and K. Gopalsamy, “Dynamics of a class of discrete-time neural networks and their continuous-time counterparts,” *Mathematics and Computers in Simulation*, vol. 53, no. 1-2, pp. 1–39, 2000.
- S. Arik, “An improved global stability result for delayed cellular neural network,” *IEEE Transactions on Circuits and Systems*, vol. 49, no. 8, pp. 1211–1214, 2002.
- Z. Wang, Y. Liu, and X. Liu, “On global asymptotic stability of neural networks with discrete and distributed delays,” *Physics Letters*, vol. 345, no. 4–6, pp. 299–308, 2005.
- C.-Y. Cheng, K.-H. Lin, and C.-W. Shih, “Multistability and convergence in delayed neural networks,” *Physica D*, vol. 225, no. 1, pp. 61–74, 2007.
- S. Mou, H. Gao, J. Lam, and W. Qiang, “A new criterion of delay-dependent asymptotic stability for Hopfield neural networks with time delay,” *IEEE Transactions on Neural Networks*, vol. 19, no. 3, pp. 532–535, 2008.
- S. I. Niculescu, *Delay Effects on Stability: A Robust Approach*, Springer, Berlin, Germany, 2001.
- J. Liang, J. Cao, and J. Lam, “Convergence of discrete-time recurrent neural networks with variable delay,” *International Journal of Bifurcation and Chaos*, vol. 15, no. 2, pp. 581–595, 2005.
- J. Liang, J. Cao, and D. W. C. Ho, “Discrete-time bidirectional associative memory neural networks with variable delays,” *Physics Letters A*, vol. 335, no. 2-3, pp. 226–234, 2005.
- S. Mohamad, “Global exponential stability in continuous-time and discrete-time delayed bidirectional neural networks,” *Physica D*, vol. 159, no. 3-4, pp. 233–251, 2001.
- C. Sun and C.-B. Feng, “Discrete-time analogues of integrodifferential equations modeling neural networks,” *Physics Letters A*, vol. 334, no. 2-3, pp. 180–191, 2005.
- X.-G. Liu, M.-L. Tang, R. Martin, and X.-B. Liu, “Discrete-time BAM neural networks with variable delays,” *Physics Letters A*, vol. 367, no. 4-5, pp. 322–330, 2007.
- H. Zhao, L. Sun, and G. Wang, “Periodic oscillation of discrete-time bidirectional associative memory neural networks,” *Neurocomputing*, vol. 70, no. 16–18, pp. 2924–2930, 2007.
- W. He and J. Cao, “Stability and bifurcation of a class of discrete-time neural networks,” *Applied Mathematical Modelling*, vol. 31, no. 10, pp. 2111–2122, 2007.
- B. Cessac, “A discrete time neural network model with spiking neurons,” *Journal of Mathematical Biology*, vol. 56, no. 3, pp. 311–345, 2008.
- Y. Liu, Z. Wang, and X. Liu, “Asymptotic stability for neural networks with mixed time-delays: the discrete-time case,” *Neural Networks*, vol. 22, no. 1, pp. 67–74, 2009.
- J. Cao and Q. Song, “Global dissipativity on uncertain discrete-time neural networks with time-varying delays,” *Discrete Dynamics in Nature and Society*, vol. 2010, Article ID 810408, 19 pages, 2010.
- T. Ensari and S. Arik, “New results for robust stability of dynamical neural networks with discrete time delays,” *Expert Systems with Applications*, vol. 37, no. 8, pp. 5925–5930, 2010.
- A. Y. Alanis, E. N. Sanchez, A. G. Loukianov, and E. A. Hernandez, “Discrete-time recurrent high order neural networks for nonlinear identification,” *Journal of the Franklin Institute*, vol. 347, no. 7, pp. 1253–1265, 2010.
- Z. Huang, X. Wang, and C. Feng, “Multiperiodicity of periodically oscillated discrete-time neural networks with transient excitatory self-connections and sigmoidal nonlinearities,” *IEEE Transactions on Neural Networks*, vol. 21, no. 10, pp. 1643–1655, 2010.
- C. Li, S. Wu, G. G. Feng, and X. Liao, “Stabilizing effects of impulses in discrete-time delayed neural networks,” *IEEE Transactions on Neural Networks*, vol. 22, no. 2, pp. 323–329, 2011.
- O. Faydasicok and S. Arik, “Robust stability analysis of a class of neural networks with discrete time delays,” *Neural Networks*, vol. 29-30, pp. 52–59, 2012.
- Y. Li and C. Wang, “Existence and global exponential stability of equilibrium for discrete-time fuzzy BAM neural networks with variable delays and impulses,” *Fuzzy Sets and Systems*, vol. 217, pp. 62–79, 2013.