Discrete Dynamics in Nature and Society

Volume 2008, Article ID 421614, 14 pages

http://dx.doi.org/10.1155/2008/421614

## Delay-Dependent Exponential Stability for Discrete-Time BAM Neural Networks with Time-Varying Delays

^{1}Department of Mathematics, Henan Institute of Science and Technology, Xinxiang 453003, China

^{2}College of Mathematics and Information Science, Henan Normal University, Xinxiang 453007, China

^{3}Key Laboratory of Measurement and Control of Complex Systems of Engineering, Ministry of Education, School of Automation, Southeast University, Nanjing 210096, China

Received 18 May 2008; Revised 5 August 2008; Accepted 10 September 2008

Academic Editor: Yong Zhou

Copyright © 2008 Yonggang Chen et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

This paper considers the delay-dependent exponential stability of discrete-time BAM neural networks with time-varying delays. By constructing a new Lyapunov functional, an improved delay-dependent exponential stability criterion is derived in terms of a linear matrix inequality (LMI). Moreover, in order to reduce conservatism, some slack matrices are introduced. Two numerical examples are presented to show the effectiveness and reduced conservatism of the proposed method.

#### 1. Introduction

Bidirectional associative memory (BAM) neural networks were first introduced by Kosko [1, 2]. They generalize the single-layer autoassociative Hebbian correlator to a two-layer pattern-matched heteroassociative circuit. This class of neural networks has been successfully applied to pattern recognition and artificial intelligence. In those applications, it is very important to guarantee that the designed neural networks are stable. On the other hand, time delays are unavoidably encountered in the implementation of neural networks, and their presence can lead to instability, oscillation, and poor performance. Therefore, the asymptotic and exponential stability analysis of BAM neural networks with time delays has received great attention in the past years; see, for example, [3–15]. The obtained results fall into two categories: delay-independent results [3–5, 7, 10] and delay-dependent results [6, 9, 11, 12, 14–16]. Generally speaking, delay-dependent results are less conservative than delay-independent ones, especially when the time delay is small.

It should be pointed out that all of the above-mentioned references are concerned with continuous-time BAM neural networks. However, when implementing continuous-time neural networks for experimental or computational purposes, it is essential to formulate a discrete-time system that is an analogue of the continuous-time recurrent neural networks [17]. Generally speaking, the stability analysis of continuous-time neural networks does not carry over to the discrete version. Therefore, the stability analysis of various discrete-time neural networks with time delays has been widely studied in recent years [17–28]. Using the M-matrix method, Liang and Cao [25] studied the exponential stability of a continuous-time BAM neural network with constant delays and its discrete analogue. Liang et al. [26] also studied the dynamics of discrete-time BAM neural networks with variable time delays, but under unreasonably severe constraints on the delay functions. By using the Lyapunov functional method and the linear matrix inequality technique, the exponential stability and robust exponential stability of discrete-time BAM neural networks with variable delays were considered in [27, 28], respectively. However, the results presented in [27, 28] are still conservative, since the Lyapunov functionals constructed there are rather simple. Therefore, there is much room to improve the results in [27, 28].

In this paper, we present a new exponential stability criterion for discrete-time BAM neural networks with time-varying delays. Compared with existing methods, the main contributions of this paper are as follows. Firstly, a more general Lyapunov functional is employed to obtain an improved exponential stability criterion; secondly, in order to reduce conservatism, some slack matrices, which bring much flexibility in solving the LMI, are introduced. The proposed exponential stability criterion is expressed as an LMI, which can be efficiently solved by the LMI toolbox in Matlab. Finally, two numerical examples are presented to show that our result is less conservative than some existing ones [27, 28].

The organization of this paper is as follows. Section 2 presents the problem formulation for discrete-time BAM neural networks with time-varying delays. In Section 3, the main result of this paper is established and some remarks are given. In Section 4, numerical examples are given to demonstrate the proposed method. Finally, the conclusion is drawn in Section 5.

*Notation*

Throughout this paper, the superscript “$T$” stands for the transpose of a matrix. $\mathbb{R}^n$ and $\mathbb{R}^{n \times m}$ denote the $n$-dimensional Euclidean space and the set of all $n \times m$ real matrices, respectively. For a real symmetric matrix $P$, $P > 0$ ($P \ge 0$) denotes that $P$ is a positive definite (positive semidefinite) matrix. $I$ is used to denote an identity matrix with proper dimension. For integers $a$ and $b$ with $a < b$, $\mathbb{N}[a, b]$ denotes the discrete interval $\{a, a+1, \ldots, b\}$. $\lambda_{\max}(P)$ and $\lambda_{\min}(P)$ stand for the maximum and minimum eigenvalues of the symmetric matrix $P$, respectively. Matrices, if not explicitly stated, are assumed to have compatible dimensions. The symmetric terms in a symmetric matrix are denoted by $*$.

#### 2. Problem Formulation

Consider the following discrete-time BAM neural network with time-varying delays:

$$x_i(k+1) = a_i x_i(k) + \sum_{j=1}^{m} b_{ij} f_j\big(y_j(k-\tau(k))\big) + I_i, \qquad y_j(k+1) = c_j y_j(k) + \sum_{i=1}^{n} d_{ji} g_i\big(x_i(k-\sigma(k))\big) + J_j, \tag{2.1}$$

where $x_i(k)$ and $y_j(k)$ are the states of the $i$th neuron from the neural field $F_X$ and the $j$th neuron from the neural field $F_Y$ at time $k$, respectively; $a_i$ and $c_j$ denote the stability of the internal neuron processes on the $X$-layer and the $Y$-layer, respectively; $b_{ij}$ and $d_{ji}$ are real constants and denote the synaptic connection weights; $f_j$ and $g_i$ denote the activation functions of the $j$th neuron from the neural field $F_Y$ and the $i$th neuron from the neural field $F_X$, respectively; $I_i$ and $J_j$ denote the external constant inputs from outside the network acting on the $i$th neuron from the neural field $F_X$ and the $j$th neuron from the neural field $F_Y$, respectively. $\tau(k)$ and $\sigma(k)$ represent time-varying interval delays satisfying

$$\tau_m \le \tau(k) \le \tau_M, \qquad \sigma_m \le \sigma(k) \le \sigma_M, \tag{2.2}$$

where $\tau_m$, $\tau_M$, $\sigma_m$, and $\sigma_M$ are positive integers.
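To illustrate how the iteration (2.1) runs, the following sketch simulates a small discrete-time BAM network. All parameters here (the matrices, delay functions, and initial values) are hypothetical and chosen only for demonstration; they are not taken from the paper's examples. With a stable linear part and small connection weights, the trajectories decay toward the equilibrium at the origin.

```python
import numpy as np

# Illustrative simulation of a discrete-time BAM network of the form (2.1):
#   x(k+1) = A x(k) + B f(y(k - tau(k))) + I
#   y(k+1) = C y(k) + D g(x(k - sigma(k))) + J
# All parameters below are hypothetical, for demonstration only.
n, m = 2, 2
A = np.diag([0.3, 0.4])                    # |a_i| < 1: stable internal neuron processes
C = np.diag([0.2, 0.5])
B = np.array([[0.1, -0.2], [0.2, 0.1]])    # synaptic connection weights
D = np.array([[-0.1, 0.15], [0.1, 0.2]])
I_ext, J_ext = np.zeros(n), np.zeros(m)    # zero external inputs: equilibrium at origin
f = g = np.tanh                            # bounded Lipschitz activation functions

tau = lambda k: 1 + (k % 3)                # time-varying delays in [1, 3], as in (2.2)
sigma = lambda k: 1 + ((k + 1) % 3)

K = 60
x, y = np.zeros((K + 1, n)), np.zeros((K + 1, m))
x[0], y[0] = [0.8, -0.6], [-0.5, 0.7]      # initial values; history before k = 0
                                           # is clamped to the initial state
for k in range(K):
    yd = y[max(k - tau(k), 0)]             # delayed y-state
    xd = x[max(k - sigma(k), 0)]           # delayed x-state
    x[k + 1] = A @ x[k] + B @ f(yd) + I_ext
    y[k + 1] = C @ y[k] + D @ g(xd) + J_ext

print(np.linalg.norm(x[-1]) + np.linalg.norm(y[-1]))  # decays toward 0
```

Since the spectral norms of the linear parts and connection weights sum to well below one here, the state norms contract despite the time-varying delays.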

Throughout this paper, we make the following assumptions on the activation functions $f_j$ and $g_i$.

(A1) The activation functions $f_j$ and $g_i$ are bounded on $\mathbb{R}$. (A2) For any $s_1, s_2 \in \mathbb{R}$ with $s_1 \ne s_2$, there exist positive scalars $L_j^f$ and $L_i^g$ such that
$$|f_j(s_1) - f_j(s_2)| \le L_j^f |s_1 - s_2|, \qquad |g_i(s_1) - g_i(s_2)| \le L_i^g |s_1 - s_2|.$$
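As a concrete instance, the commonly used activation $f(s) = \tanh(s)$ satisfies both assumptions with Lipschitz constant $1$. A quick numerical check (illustrative only, on a finite grid):

```python
import numpy as np

# Numerical check that f(s) = tanh(s) satisfies (A1) (boundedness) and
# (A2) (a global Lipschitz condition with constant L = 1) on a fine grid.
s = np.linspace(-50, 50, 200001)
f = np.tanh(s)

bounded = np.max(np.abs(f)) <= 1.0             # (A1): |tanh(s)| <= 1
slopes = np.abs(np.diff(f) / np.diff(s))       # finite-difference secant slopes
lipschitz = np.max(slopes) <= 1.0 + 1e-9       # (A2): |f(s1)-f(s2)| <= |s1-s2|
print(bounded, lipschitz)
```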

It is clear that under Assumptions (A1) and (A2), system (2.1) has at least one equilibrium $(x^*, y^*)$. In order to simplify the proof, we shift the equilibrium point of system (2.1) to the origin. Let $u_i(k) = x_i(k) - x_i^*$ and $v_j(k) = y_j(k) - y_j^*$; then system (2.1) can be transformed to

$$u_i(k+1) = a_i u_i(k) + \sum_{j=1}^{m} b_{ij} \bar f_j\big(v_j(k-\tau(k))\big), \qquad v_j(k+1) = c_j v_j(k) + \sum_{i=1}^{n} d_{ji} \bar g_i\big(u_i(k-\sigma(k))\big), \tag{2.4}$$

where $\bar f_j(v_j(\cdot)) = f_j(v_j(\cdot) + y_j^*) - f_j(y_j^*)$ and $\bar g_i(u_i(\cdot)) = g_i(u_i(\cdot) + x_i^*) - g_i(x_i^*)$. Obviously, the activation functions $\bar f_j$ and $\bar g_i$ satisfy the following condition.

(A3) For any $s \in \mathbb{R}$, there exist positive scalars $L_j^f$ and $L_i^g$ such that
$$|\bar f_j(s)| \le L_j^f |s|, \qquad |\bar g_i(s)| \le L_i^g |s|, \tag{2.5}$$
with $\bar f_j(0) = \bar g_i(0) = 0$. Also, we equip system (2.4) with the initial values
$$u_i(s) = \phi_i(s), \quad s \in \mathbb{N}[-\sigma_M, 0], \qquad v_j(s) = \varphi_j(s), \quad s \in \mathbb{N}[-\tau_M, 0], \tag{2.6}$$
where $\phi_i$ and $\varphi_j$ are given scalar sequences.

For convenience, we rewrite system (2.4) in the vector form

$$u(k+1) = A u(k) + B \bar f\big(v(k-\tau(k))\big), \qquad v(k+1) = C v(k) + D \bar g\big(u(k-\sigma(k))\big), \tag{2.7}$$

where $u(k) = [u_1(k), \ldots, u_n(k)]^T$, $v(k) = [v_1(k), \ldots, v_m(k)]^T$, $A = \mathrm{diag}(a_1, \ldots, a_n)$, $C = \mathrm{diag}(c_1, \ldots, c_m)$, $B = (b_{ij})_{n \times m}$, $D = (d_{ji})_{m \times n}$, $\bar f(v) = [\bar f_1(v_1), \ldots, \bar f_m(v_m)]^T$, and $\bar g(u) = [\bar g_1(u_1), \ldots, \bar g_n(u_n)]^T$.

From the above analysis, we can see that the exponential stability problem of system (2.1) at its equilibrium is equivalent to the exponential stability problem of the zero solution of system (2.7). Therefore, in the following, we devote ourselves to the exponential stability analysis of system (2.7).

Before giving the main result, we first introduce the following definition and lemma.

*Definition 2.1. * The trivial solution of BAM neural network (2.7) is said to be globally exponentially stable if there exist scalars $\alpha > 0$ and $0 < \beta < 1$ such that, for all $k \ge 0$,
$$\|u(k)\|^2 + \|v(k)\|^2 \le \alpha \beta^k \Big( \sup_{s \in \mathbb{N}[-\sigma_M, 0]} \|u(s)\|^2 + \sup_{s \in \mathbb{N}[-\tau_M, 0]} \|v(s)\|^2 \Big).$$

Lemma 2.2 (see [29]). *For any real vectors $a$, $b$ and any matrix $X > 0$ with appropriate dimensions, it follows that $2a^T b \le a^T X a + b^T X^{-1} b$.*

#### 3. Main Result

In this section, we are in a position to present the global exponential stability criterion of system (2.7).

Theorem 3.1. *For given diagonal matrices of Lipschitz constants $L^f = \mathrm{diag}(L_1^f, \ldots, L_m^f)$ and $L^g = \mathrm{diag}(L_1^g, \ldots, L_n^g)$, under Assumptions (A1) and (A2), system (2.7) is globally exponentially stable if there exist symmetric positive definite matrices, slack matrices, and positive diagonal matrices such that the LMI (3.1) holds.*

*Proof. *Choose the following Lyapunov-Krasovskii functional candidate for system (2.7): where and

Calculating the difference of the Lyapunov functional along the trajectories of system (2.7), we can obtain By Lemma 2.2, we have where and By inequalities (2.5), there exist positive diagonal matrices such that the following inequalities hold: where .

Substituting (3.10)–(3.13) into (3.9), we have where and are defined in Theorem 3.1. Thus, if LMI (3.1) holds, it can be concluded from the Schur complement that , which implies that . Note that Substituting (3.22) and (3.23) into (3.21), and combining (3.17)–(3.21), we have where On the other hand, Therefore, we can obtain where . Obviously, , so by Definition 2.1, system (2.7), and hence system (2.1), is globally exponentially stable.

*Remark 3.2. *Based on the linear matrix inequality (LMI) technique and Lyapunov stability theory, the exponential stability of the discrete-time delayed BAM neural network (2.1) was investigated in [26–28]. However, it should be noted that the results in [26] rely on unreasonably severe constraints on the delay functions, and the results in [27, 28] are based on simple Lyapunov functionals. In this paper, in order to obtain a less conservative stability criterion, a novel Lyapunov functional is employed, which is more general than the traditional ones in [26–28]. Moreover, some slack matrices, which bring much flexibility in solving the LMI, are introduced.
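Although the full LMI (3.1) involves many coupled block matrices, the underlying feasibility idea can be illustrated on the simplest discrete-time Lyapunov LMI, $A^T P A - P < 0$. The sketch below assumes a hypothetical stable matrix $A$ (not from the paper) and uses SciPy's discrete Lyapunov solver in place of the Matlab LMI toolbox mentioned in the introduction.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical Schur-stable matrix (spectral radius < 1); not from the paper.
A = np.array([[0.3, 0.1],
              [-0.2, 0.4]])

# The simplest discrete Lyapunov LMI: find P > 0 with A^T P A - P < 0.
# solve_discrete_lyapunov(a, q) returns X satisfying a X a^H - X + q = 0,
# so passing a = A^T yields A^T P A - P = -Q.
Q = np.eye(2)
P = solve_discrete_lyapunov(A.T, Q)

print("P positive definite:", bool(np.all(np.linalg.eigvalsh(P) > 0)))
print("A^T P A - P negative definite:",
      bool(np.all(np.linalg.eigvalsh(A.T @ P @ A - P) < 0)))
```

Feasibility of such a Lyapunov inequality certifies exponential decay of the linear part; the LMI in Theorem 3.1 plays the analogous role for the full delayed system (2.7).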

*Remark 3.3. *By setting the exponential decay rate equal to one in Theorem 3.1, we can obtain a global asymptotic stability criterion for the discrete-time BAM neural network (2.7).

*Remark 3.4. *Theorem 3.1 depends on both the delay upper bounds $\tau_M$, $\sigma_M$ and the delay intervals $\tau_M - \tau_m$ and $\sigma_M - \sigma_m$.

*Remark 3.5. *The method proposed in this paper can be generalized to more complex neural networks, such as delayed discrete-time BAM neural networks with parameter uncertainties [30, 31] or stochastic perturbations [30–32], delayed interval discrete-time BAM neural networks [28], and discrete-time analogues of BAM neural networks with mixed delays [32, 33].

#### 4. Numerical Examples

*Example 4.1. *Consider the delayed discrete-time BAM neural network (2.7) with the following parameters: The activation functions satisfy Assumptions (A1) and (A2) with

For this example, by Theorem 3.1, we can obtain the maximum allowable delay bounds guaranteeing the asymptotic stability of this system. For comparison with [27, Theorem 1] and [28, Corollary 1], the results are listed in Table 1. For this example, it is obvious that our result is less conservative than those in [27, 28].

*Example 4.2 (see [27]). *Consider the delayed discrete-time BAM neural network (2.7) with the following parameters: The activation functions satisfy Assumptions (A1) and (A2) with

Now, we assume . It is obvious that , that is, . In this case, it is found that the exponential stability conditions proposed in [27, 28] are not satisfied. However, it can be concluded that this system is globally exponentially stable by using Theorem 3.1 of this paper. If we instead assume , then, using [27, Theorem 1] and [28, Corollary 1], the achieved maximum allowable delays guaranteeing the global exponential stability of this system are , respectively. However, a larger delay bound can be obtained by using Theorem 3.1 of this paper. For this example, it is seen that our result improves some existing results [27, 28].

#### 5. Conclusion

In this paper, we have considered the delay-dependent exponential stability of discrete-time BAM neural networks with time-varying delays. Based on Lyapunov stability theory and the linear matrix inequality technique, a novel delay-dependent stability criterion has been obtained. Two numerical examples show that the proposed stability condition is less conservative than previously established ones.

#### Acknowledgments

The authors would like to thank the Associate Editor and the anonymous reviewers for their constructive comments and suggestions to improve the quality of the paper. This work was supported by the National Natural Science Foundation of China (no. 60643003) and the Natural Science Foundation of Henan Education Department (no. 2007120005).

#### References

1. B. Kosko, “Adaptive bidirectional associative memories,” *Applied Optics*, vol. 26, no. 23, pp. 4947–4960, 1987.
2. B. Kosko, “Bidirectional associative memories,” *IEEE Transactions on Systems, Man and Cybernetics*, vol. 18, no. 1, pp. 49–60, 1988.
3. J. Cao and L. Wang, “Exponential stability and periodic oscillatory solution in BAM networks with delays,” *IEEE Transactions on Neural Networks*, vol. 13, no. 2, pp. 457–463, 2002.
4. J. Cao and M. Dong, “Exponential stability of delayed bi-directional associative memory networks,” *Applied Mathematics and Computation*, vol. 135, no. 1, pp. 105–112, 2003.
5. Y. Li, “Existence and stability of periodic solution for BAM neural networks with distributed delays,” *Applied Mathematics and Computation*, vol. 159, no. 3, pp. 847–862, 2004.
6. X. Huang, J. Cao, and D.-S. Huang, “LMI-based approach for delay-dependent exponential stability analysis of BAM neural networks,” *Chaos, Solitons & Fractals*, vol. 24, no. 3, pp. 885–898, 2005.
7. S. Arik, “Global asymptotic stability analysis of bidirectional associative memory neural networks with time delays,” *IEEE Transactions on Neural Networks*, vol. 16, no. 3, pp. 580–586, 2005.
8. M. Jiang, Y. Shen, and X. Liao, “Global stability of periodic solution for bidirectional associative memory neural networks with varying-time delay,” *Applied Mathematics and Computation*, vol. 182, no. 1, pp. 509–520, 2006.
9. J. H. Park, “A novel criterion for global asymptotic stability of BAM neural networks with time delays,” *Chaos, Solitons & Fractals*, vol. 29, no. 2, pp. 446–453, 2006.
10. X. Lou and B. Cui, “On the global robust asymptotic stability of BAM neural networks with time-varying delays,” *Neurocomputing*, vol. 70, no. 1–3, pp. 273–279, 2006.
11. J. H. Park, “Robust stability of bidirectional associative memory neural networks with time delays,” *Physics Letters A*, vol. 349, no. 6, pp. 494–499, 2006.
12. J. Cao, D. W. C. Ho, and X. Huang, “LMI-based criteria for global robust stability of bidirectional associative memory networks with time delay,” *Nonlinear Analysis: Theory, Methods & Applications*, vol. 66, no. 7, pp. 1558–1572, 2007.
13. D. W. C. Ho, J. Liang, and J. Lam, “Global exponential stability of impulsive high-order BAM neural networks with time-varying delays,” *Neural Networks*, vol. 19, no. 10, pp. 1581–1590, 2006.
14. L. Sheng and H. Yang, “Novel global robust exponential stability criterion for uncertain BAM neural networks with time-varying delays,” *Chaos, Solitons & Fractals*, in press.
15. J. H. Park, S. M. Lee, and O. M. Kwon, “On exponential stability of bidirectional associative memory neural networks with time-varying delays,” *Chaos, Solitons & Fractals*, in press.
16. Y. Wang, “Global exponential stability analysis of bidirectional associative memory neural networks with time-varying delays,” *Nonlinear Analysis: Real World Applications*, in press.
17. Y. Liu, Z. Wang, A. Serrano, and X. Liu, “Discrete-time recurrent neural networks with time-varying delays: exponential stability analysis,” *Physics Letters A*, vol. 362, no. 5-6, pp. 480–488, 2007.
18. S. Hu and J. Wang, “Global stability of a class of discrete-time recurrent neural networks,” *IEEE Transactions on Circuits and Systems I*, vol. 49, no. 8, pp. 1104–1117, 2002.
19. S. Mohamad and K. Gopalsamy, “Exponential stability of continuous-time and discrete-time cellular neural networks with delays,” *Applied Mathematics and Computation*, vol. 135, no. 1, pp. 17–38, 2003.
20. W. Xiong and J. Cao, “Global exponential stability of discrete-time Cohen-Grossberg neural networks,” *Neurocomputing*, vol. 64, pp. 433–446, 2005.
21. L. Wang and Z. Xu, “Sufficient and necessary conditions for global exponential stability of discrete-time recurrent neural networks,” *IEEE Transactions on Circuits and Systems I*, vol. 53, no. 6, pp. 1373–1380, 2006.
22. W.-H. Chen, X. Lu, and D.-Y. Liang, “Global exponential stability for discrete-time neural networks with variable delays,” *Physics Letters A*, vol. 358, no. 3, pp. 186–198, 2006.
23. Q. Song and Z. Wang, “A delay-dependent LMI approach to dynamics analysis of discrete-time recurrent neural networks with time-varying delays,” *Physics Letters A*, vol. 368, no. 1-2, pp. 134–145, 2007.
24. B. Zhang, S. Xu, and Y. Zou, “Improved delay-dependent exponential stability criteria for discrete-time recurrent neural networks with time-varying delays,” *Neurocomputing*, in press.
25. J. Liang and J. Cao, “Exponential stability of continuous-time and discrete-time bidirectional associative memory networks with delays,” *Chaos, Solitons & Fractals*, vol. 22, no. 4, pp. 773–785, 2004.
26. J. Liang, J. Cao, and D. W. C. Ho, “Discrete-time bidirectional associative memory neural networks with variable delays,” *Physics Letters A*, vol. 335, no. 2-3, pp. 226–234, 2005.
27. X.-G. Liu, M.-L. Tang, R. Martin, and X.-B. Liu, “Discrete-time BAM neural networks with variable delays,” *Physics Letters A*, vol. 367, no. 4-5, pp. 322–330, 2007.
28. M. Gao and B. Cui, “Global robust exponential stability of discrete-time interval BAM neural networks with time-varying delays,” *Applied Mathematical Modelling*, in press.
29. M. S. Mahmoud, *Resilient Control of Uncertain Dynamical Systems*, vol. 303 of *Lecture Notes in Control and Information Sciences*, Springer, Berlin, Germany, 2004.
30. Z. Wang, S. Lauria, J. Fang, and X. Liu, “Exponential stability of uncertain stochastic neural networks with mixed time-delays,” *Chaos, Solitons & Fractals*, vol. 32, no. 1, pp. 62–72, 2007.
31. Z. Wang, H. Shu, J. Fang, and X. Liu, “Robust stability for stochastic Hopfield neural networks with time delays,” *Nonlinear Analysis: Real World Applications*, vol. 7, no. 5, pp. 1119–1128, 2006.
32. Z. Wang, Y. Liu, M. Li, and X. Liu, “Stability analysis for stochastic Cohen-Grossberg neural networks with mixed time delays,” *IEEE Transactions on Neural Networks*, vol. 17, no. 3, pp. 814–820, 2006.
33. Z. Wang, Y. Liu, and X. Liu, “On global asymptotic stability of neural networks with discrete and distributed delays,” *Physics Letters A*, vol. 345, no. 4–6, pp. 299–308, 2005.