Abstract
This paper considers the delay-dependent exponential stability of discrete-time BAM neural networks with time-varying delays. By constructing a new Lyapunov functional, an improved delay-dependent exponential stability criterion is derived in terms of a linear matrix inequality (LMI). Moreover, some slack matrices are introduced to reduce the conservatism of the criterion. Two numerical examples are presented to show the effectiveness and reduced conservatism of the proposed method.
1. Introduction
Bidirectional associative memory (BAM) neural networks were first introduced by Kosko [1, 2]. They generalize the single-layer autoassociative Hebbian correlator to a two-layer pattern-matched heteroassociative circuit. This class of neural networks has been successfully applied to pattern recognition and artificial intelligence. In those applications, it is very important to guarantee that the designed neural networks are stable. On the other hand, time delays are unavoidably encountered in the implementation of neural networks, and their existence can lead to instability, oscillation, and poor performance. Therefore, the asymptotic and exponential stability analysis of BAM neural networks with time delays has received great attention in the past years; see, for example, [3–15]. The obtained results fall into two categories: delay-independent results [3–5, 7, 10] and delay-dependent results [6, 9, 11, 12, 14–16]. Generally speaking, delay-dependent results are less conservative than delay-independent ones, especially when the time delay is small.
It should be pointed out that all of the above-mentioned references are concerned with continuous-time BAM neural networks. However, when implementing continuous-time neural networks for computer simulation, for experimental or computational purposes, it is essential to formulate a discrete-time system that is an analogue of the continuous-time recurrent neural networks [17]. Generally speaking, the stability analysis of continuous-time neural networks does not carry over to their discrete versions. Therefore, the stability analysis of various discrete-time neural networks with time delays has been widely studied in recent years [17–28]. Using the M-matrix method, Liang and Cao [25] studied the exponential stability of a continuous-time BAM neural network with constant delays and its discrete analogue. Liang et al. [26] also studied the dynamics of discrete-time BAM neural networks with variable time delays, but under unreasonably severe constraints on the delay functions. By using the Lyapunov functional method and the linear matrix inequality technique, the exponential stability and robust exponential stability of discrete-time BAM neural networks with variable delays were considered in [27, 28], respectively. However, the results presented in [27, 28] are still conservative, since the Lyapunov functionals constructed there are rather simple. Therefore, there is much room to improve the results in [27, 28].
In this paper, we present a new exponential stability criterion for discrete-time BAM neural networks with time-varying delays. Compared with existing methods, the main contributions of this paper are as follows. First, a more general Lyapunov functional is employed to obtain the improved exponential stability criterion; second, some slack matrices, which bring much flexibility in solving the LMI, are introduced to reduce the conservatism. The proposed exponential stability criterion is expressed as an LMI, which can be efficiently solved by the LMI Toolbox in MATLAB. Finally, two numerical examples are presented to show that our result is less conservative than some existing ones [27, 28].
The organization of this paper is as follows. Section 2 presents the problem formulation of discrete-time BAM neural networks with time-varying delays. In Section 3, the main result of this paper is established and some remarks are given. In Section 4, numerical examples are given to demonstrate the proposed method. Finally, the conclusion is drawn in Section 5.
Notation
Throughout this paper, the superscript "T" stands for the transpose of a matrix. R^n and R^{n×m} denote the n-dimensional Euclidean space and the set of all n×m real matrices, respectively. A real symmetric matrix P > 0 (P ≥ 0) denotes P being a positive definite (positive semidefinite) matrix. I is used to denote an identity matrix with proper dimension. For integers a and b with a ≤ b, N[a, b] denotes the discrete interval {a, a+1, …, b}. λ_max(A) and λ_min(A) stand for the maximum and minimum eigenvalues of the symmetric matrix A, respectively. Matrices, if not explicitly stated, are assumed to have compatible dimensions. The symmetric terms in a symmetric matrix are denoted by ∗.
2. Problem Formulation
Consider the following discrete-time BAM neural network with time-varying delays: where and are the states of the th neuron from the neural field and the th neuron from the neural field at time , respectively; describe the stability of the internal neuron processes on the -layer and the -layer, respectively; are real constants, and denote the synaptic connection weights; denote the activation functions of the th neuron from the neural field and the th neuron from the neural field , respectively; and denote the external constant inputs from outside the network acting on the th neuron from the neural field and the th neuron from the neural field , respectively. and represent time-varying interval delays satisfying where , and are positive integers.
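Since the state equations of model (2.1) are not reproduced above, the following Python sketch simulates a typical discrete-time BAM recurrence of this class. All matrices, delay functions, and activations below are hypothetical placeholders chosen for illustration, not the paper's parameters.

```python
import numpy as np

# Hypothetical two-neuron-per-layer discrete-time BAM network of the form
#   x(k+1) = A x(k) + W1 f(y(k - tau(k))) + I1
#   y(k+1) = B y(k) + W2 g(x(k - sigma(k))) + I2
# (a typical form for model (2.1); the paper's exact parameters are elided).
A = np.diag([0.3, 0.4])                      # internal decay, X-layer
B = np.diag([0.2, 0.5])                      # internal decay, Y-layer
W1 = np.array([[0.1, -0.2], [0.05, 0.1]])    # Y -> X synaptic weights
W2 = np.array([[0.15, 0.1], [-0.1, 0.2]])    # X -> Y synaptic weights
I1 = np.zeros(2)                             # external inputs (zero at equilibrium)
I2 = np.zeros(2)
f = np.tanh                                  # bounded activations satisfying (A1), (A2)
g = np.tanh

tau = lambda k: 2 + (k % 2)                  # time-varying delay in [2, 3]
sigma = lambda k: 1 + (k % 3)                # time-varying delay in [1, 3]

d = 3                                        # largest delay bound
K = 400
x = np.zeros((K + 1, 2))
y = np.zeros((K + 1, 2))
x[:d + 1] = [0.5, -0.3]                      # constant initial sequence on N[-d, 0]
y[:d + 1] = [-0.2, 0.4]

for k in range(d, K):
    x[k + 1] = A @ x[k] + W1 @ f(y[k - tau(k)]) + I1
    y[k + 1] = B @ y[k] + W2 @ g(x[k - sigma(k)]) + I2

print(float(np.linalg.norm(x[K])), float(np.linalg.norm(y[K])))
```

With these (deliberately small) weights the trajectories decay toward the origin, the behavior that the stability criterion of Section 3 is designed to certify.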
Throughout this paper, we make the following assumptions on the activation functions .
(A1) The activation functions are bounded on .
(A2) For any , there exist positive scalars such that
It is clear that under Assumptions (A1) and (A2), system (2.1) has at least one equilibrium. In order to simplify the proof, we shift the equilibrium point of system (2.1) to the origin. Let ; then system (2.1) can be transformed to Obviously, the transformed activation functions satisfy the following conditions.
(A3) For any , there exist positive scalars such that Also, we assume that system (2.4) is supplemented with the initial condition where .
For convenience, we rewrite system (2.4) in the form where
From the above analysis, we can see that the exponential stability problem of system (2.1) at its equilibrium is converted into the stability problem of the zero solution of system (2.7). Therefore, in what follows, we focus on the exponential stability analysis of system (2.7).
Before giving the main result, we first introduce the following definition and lemmas.
Definition 2.1. The trivial solution of BAM neural network (2.7) is said to be globally exponentially stable, if there exist scalars and such that
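The inequality defining exponential stability has been lost from the text above. For completeness, a standard formulation (the symbols α, β, φ, ϕ, and d are assumed here, since the paper's own notation is elided) reads:

```latex
\left\| \begin{pmatrix} x(k) \\ y(k) \end{pmatrix} \right\|
\le \alpha\, \beta^{k} \sup_{s \in \mathbb{N}[-d,\,0]}
\left\| \begin{pmatrix} \phi(s) \\ \varphi(s) \end{pmatrix} \right\|,
\qquad k \ge 0,
```

with constants α > 0 and 0 < β < 1, where φ, ϕ denote the initial sequences and d is the largest delay bound; β then quantifies the exponential decay rate.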
Lemma 2.2 (see [29]). For any real vectors , and any matrix with appropriate dimensions, it follows that
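The inequality of Lemma 2.2 is elided above; in its common form it states that 2aᵀb ≤ aᵀXa + bᵀX⁻¹b for any real vectors a, b and any symmetric positive definite matrix X (it follows from ‖X^{1/2}a − X^{−1/2}b‖² ≥ 0). Assuming that form, the following sketch checks it numerically on random data:

```python
import numpy as np

rng = np.random.default_rng(0)

def lemma_gap(a, b, X):
    """Return (a^T X a + b^T X^{-1} b) - 2 a^T b; nonnegative when the lemma holds."""
    return a @ X @ a + b @ np.linalg.solve(X, b) - 2 * a @ b

n = 4
for _ in range(1000):
    a = rng.standard_normal(n)
    b = rng.standard_normal(n)
    M = rng.standard_normal((n, n))
    X = M @ M.T + n * np.eye(n)        # symmetric positive definite
    assert lemma_gap(a, b, X) >= -1e-9  # tolerance only for floating-point roundoff

print("Lemma 2.2 verified on 1000 random instances")
```

The lemma is used in the proof of Theorem 3.1 to bound the cross terms in the difference of the Lyapunov functional, with X playing the role of a free matrix variable of the LMI.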
3. Main Result
In this section, we are in a position to present the global exponential stability criterion of system (2.7).
Theorem 3.1. For given diagonal matrices , under Assumptions (A1) and (A2), system (2.7) is globally exponentially stable if there exist matrices and diagonal matrices such that the following LMI holds: where
Proof. Choose the following Lyapunov–Krasovskii functional candidate for system (2.7): where and
Calculating the difference of along the trajectories of system (2.7), we can obtain
By Lemma 2.2, we have where and
By inequalities (2.5), there exist positive diagonal matrices such that the following inequalities hold: where .
Substituting (3.10)–(3.13) into (3.9), we have where and are defined in Theorem 3.1. Thus, if LMI (3.1) holds, it can be concluded from the Schur complement that , which implies that .
Note that Substituting (3.22) and (3.23) into (3.21), and combining (3.17)–(3.21), we have where
On the other hand, Therefore, we can obtain where .
Obviously, , and by Definition 2.1, system (2.7), and hence system (2.1), is globally exponentially stable.
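The Schur complement step used in the proof above can be illustrated numerically. The blocks below are an arbitrary hypothetical example, not the entries of LMI (3.1); the sketch only demonstrates the standard equivalence that a symmetric block matrix [[A, B], [Bᵀ, C]] is negative definite if and only if C < 0 and the Schur complement A − BC⁻¹Bᵀ < 0.

```python
import numpy as np

def is_neg_def(M):
    """Negative definiteness of a symmetric matrix, checked via its eigenvalues."""
    return bool(np.all(np.linalg.eigvalsh(M) < 0))

# Hypothetical symmetric blocks (not the paper's LMI entries).
A = np.array([[-4.0, 1.0], [1.0, -3.0]])
B = np.array([[0.5, 0.2], [0.1, 0.3]])
C = np.array([[-2.0, 0.4], [0.4, -1.5]])

M = np.block([[A, B], [B.T, C]])          # full block matrix
schur = A - B @ np.linalg.solve(C, B.T)   # Schur complement of C in M

# Schur complement lemma: M < 0  iff  C < 0 and A - B C^{-1} B^T < 0.
assert is_neg_def(M) == (is_neg_def(C) and is_neg_def(schur))
print(is_neg_def(M))
```

In practice, feasibility of LMI (3.1) itself would be checked with a semidefinite programming solver, such as the MATLAB LMI Toolbox mentioned in Section 1.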
Remark 3.2. Based on the linear matrix inequality (LMI) technique and Lyapunov stability theory, the exponential stability of the discrete-time delayed BAM neural network (2.1) was investigated in [26–28]. However, it should be noted that the results in [26] rely on unreasonably severe constraints on the delay functions , and the results in [27, 28] are based on rather simple Lyapunov functionals. In this paper, in order to obtain a less conservative stability criterion, a novel Lyapunov functional is employed, which contains and is more general than the traditional ones [26–28]. Moreover, some slack matrices, which bring much flexibility in solving the LMI, are introduced.
Remark 3.3. By setting in Theorem 3.1, we can obtain the global asymptotic stability criterion of discrete-time BAM neural network (2.7).
Remark 3.4. Theorem 3.1 in this paper depends on both the delay upper bounds and the delay intervals and .
Remark 3.5. The proposed method in this paper can be generalized to more complex neural networks, such as delayed discrete-time BAM neural networks with parameter uncertainties [30, 31] and stochastic perturbations [30–32], delayed interval discrete-time BAM neural networks [28], and discrete-time analogues of BAM neural networks with mixed delays [32, 33].
4. Numerical Examples
Example 4.1. Consider the delayed discrete-time BAM neural network (2.7) with the following parameters: The activation functions satisfy Assumptions (A1) and (A2) with
For this example, by Theorem 3.1 in this paper with , we can obtain the maximum allowable delay bounds guaranteeing the asymptotic stability of this system. A comparison with [27, Theorem 1] and [28, Corollary 1] is given in Table 1. For this example, it is obvious that our result is less conservative than those in [27, 28].
Example 4.2 (see [27]). Consider the delayed discrete-time BAM neural network (2.7) with the following parameters: The activation functions satisfy Assumptions (A1) and (A2) with
Now, we assume . It is obvious that , that is, In this case, it is found that the exponential stability conditions proposed in [27, 28] are not satisfied. However, it can be concluded that this system is globally exponentially stable by using Theorem 3.1 in this paper. If we assume , then using [27, Theorem 1] and [28, Corollary 1], the achieved maximum allowable delays for guaranteeing the global exponential stability of this system are , respectively. However, we can obtain the delay bound by using Theorem 3.1 in this paper. For this example, it is seen that our result improves some existing results [27, 28].
5. Conclusion
In this paper, we have considered the delay-dependent exponential stability of discrete-time BAM neural networks with time-varying delays. Based on Lyapunov stability theory and the linear matrix inequality technique, a novel delay-dependent stability criterion has been obtained. Two numerical examples show that the proposed stability condition is less conservative than previously established ones.
Acknowledgments
The authors would like to thank the Associate Editor and the anonymous reviewers for their constructive comments and suggestions to improve the quality of the paper. This work was supported by the National Natural Science Foundation of China (no. 60643003) and the Natural Science Foundation of Henan Education Department (no. 2007120005).