Exponential Stability for Discrete-Time Stochastic BAM Neural Networks with Discrete and Distributed Delays
R. Raja, R. Sakthivel, and S. Marshal Anthoni
Academic Editors: C.-K. Lin and N. I. Trinajstić
Received 13 September 2011
Accepted 23 October 2011
Published 16 January 2012
Abstract
This paper deals with the stability analysis problem for a class of discrete-time stochastic
BAM neural networks with discrete and distributed time-varying delays. By constructing a suitable
Lyapunov-Krasovskii functional and employing M-matrix theory, we derive sufficient
conditions ensuring the global exponential stability of the equilibrium point for stochastic BAM
neural networks with time-varying delays. The conditions obtained here are expressed in terms
of LMIs, whose feasibility can be easily checked with the MATLAB LMI Control Toolbox. A numerical
example is presented to show the effectiveness of the derived LMI-based stability conditions.
1. Introduction
Recently, the study of bidirectional associative memory (BAM) neural networks has attracted the attention of many researchers due to their applications in many fields, such as pattern recognition, automatic control, associative memory, signal processing, and optimization; see, for example, [1–9]. The BAM neural network model, proposed by Kosko [10, 11], is a two-layer nonlinear feedback network in which the neurons in one layer are fully interconnected to the neurons in the other layer, while there are no interconnections among neurons within the same layer.
Furthermore, due to the finite switching speed of neuron amplifiers and the finite speed of signal propagation, time delays are unavoidable in the implementation of neural networks [12–14]. According to the way they occur, time delays can be classified into two types: discrete and distributed. A discrete time delay is relatively easy to identify in practice, and hence the stability analysis of BAM networks with discrete delays has been an attractive subject of research in the past few years; see [15, 16]. On the other hand, due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths, a neural network usually has a spatial nature. Therefore, it is necessary to introduce delays continuously distributed over a certain duration of time; see [17, 18].
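To make the distinction between the two delay types concrete, the following Python sketch iterates a scalar discrete-time system that combines a discrete delay term f(x[k − τ]) with a distributed delay term that sums the delayed activations over a finite window. All coefficients and the window length are illustrative choices, not parameters from this paper.

```python
import math

def simulate(steps=200, tau=3, window=5):
    """Iterate a scalar system with one discrete delay (tau) and one
    distributed delay (sum over the last `window` steps).
    All coefficients below are illustrative, not from the paper."""
    a, b, c = 0.5, 0.2, 0.05       # state feedback and delay gains
    f = math.tanh                  # activation function
    hist = max(tau, window)
    x = [0.8] * (hist + 1)         # constant initial history
    for k in range(hist, hist + steps):
        discrete_term = b * f(x[k - tau])
        distributed_term = c * sum(f(x[k - i]) for i in range(1, window + 1))
        x.append(a * x[k] + discrete_term + distributed_term)
    return x

traj = simulate()
```

With these gains the total loop gain is below one, so the trajectory decays toward the origin; increasing the delay gains can destroy this property, which is exactly what delay-dependent stability conditions quantify.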
Moreover, in implementations of neural networks, stochastic disturbances are inevitable owing to thermal noise in electronic devices. In practice, the stochastic phenomenon usually appears in the electrical circuit design of neural networks, and stochastic effects can destabilize a neural system. It is therefore important to account for stochastic effects when studying the stability of neural networks with delays. It is noted that most BAM neural networks have been assumed to act in a continuous-time manner; however, for the implementation of discrete-time BAM networks, only a few works have appeared in the literature; see [6, 19–24] and the references cited therein. Hence, there is a crucial need to study the dynamics of discrete-time BAM neural networks, which is all the more significant from a practical point of view. In [19], Gao and Cui discussed the global robust exponential stability of discrete-time interval BAM neural networks with time-varying delays, and in [24], the authors investigated the global exponential stability of a discrete-time BAM neural network with a time-variable delay. In the aforementioned references, the stability problem for BAM neural networks is considered only with discrete delays; distributed delays have not been taken into account and remain challenging. Our main aim in this work is to make a first attempt to close this gap.
Motivated by the above points, in this paper we study the exponential stability problem for a new class of discrete-time stochastic BAM neural networks with both discrete and distributed delays. The existence of the equilibrium point is proved under mild conditions on the activation functions. By constructing an appropriate Lyapunov-Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the discrete-time BAM neural networks to be globally exponentially stable in the mean square. Here, we note that the LMIs can be easily solved by using the MATLAB LMI Toolbox, and no tuning of parameters is involved. Finally, a numerical example is presented to show the usefulness of the derived LMI-based stability conditions.
Notations. Throughout this paper, and denote, respectively, the -dimensional Euclidean space and the set of all real matrices. denotes the identity matrix with appropriate dimensions, and denotes the diagonal matrix. For real symmetric matrices and , the notation (resp., ) means that the matrix is positive semidefinite (resp., positive definite). stands for the Euclidean norm in . (resp., ) stands for the maximum (resp., minimum) eigenvalue of the matrix . The symbol within a matrix represents the symmetric terms of the matrix.
2. Problem Description and Preliminaries
Consider the following discrete-time stochastic BAM neural networks with both discrete and distributed delays of the following form:
or, in an equivalent form,
for , where and are the neural state vectors; and are the state feedback coefficient matrices; , and are, respectively, the connection weight matrices, the discretely delayed connection weight matrices, and the distributed delayed connection weight matrices; and denote the discrete time-varying delays satisfying
where , and are known positive integers; denotes the distributed time-varying delay. Then
denote the neuron activation functions. The constant vectors and are the external inputs from outside the system; and are scalar constants, where and are scalar Wiener processes (Brownian motions) on the probability space with
with being the mathematical expectation operator; and are nonlinear vector functions representing the disturbance intensities.
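Since the symbols of the model above did not survive extraction, the following Python sketch fixes one concrete, hypothetical instance of such a two-layer discrete-time stochastic BAM network: each layer feeds back its own state and receives the other layer's discretely delayed activations, plus a state-multiplicative noise term playing the role of the Wiener increments. All matrices and intensities are illustrative placeholders, not the paper's data.

```python
import math
import random

random.seed(0)

def simulate_bam(steps=150, tau=2, sigma=0.05):
    """Simulate a hypothetical 2-neuron-per-layer discrete-time stochastic
    BAM network. All matrices below are illustrative, not from the paper."""
    A = [0.30, 0.40]                    # layer-x state feedback (diagonal)
    C = [0.35, 0.25]                    # layer-y state feedback (diagonal)
    W = [[0.10, -0.20], [0.15, 0.10]]   # weights on delayed activations y -> x
    V = [[-0.10, 0.20], [0.10, -0.15]]  # weights on delayed activations x -> y
    f = math.tanh
    x = [[0.5, -0.4] for _ in range(tau + 1)]   # constant initial histories
    y = [[0.3, 0.6] for _ in range(tau + 1)]
    for k in range(tau, tau + steps):
        fy = [f(v) for v in y[k - tau]]         # delayed activations of layer y
        fx = [f(v) for v in x[k - tau]]
        x.append([A[i] * x[k][i]
                  + sum(W[i][j] * fy[j] for j in range(2))
                  + sigma * x[k][i] * random.gauss(0.0, 1.0)
                  for i in range(2)])
        y.append([C[i] * y[k][i]
                  + sum(V[i][j] * fx[j] for j in range(2))
                  + sigma * y[k][i] * random.gauss(0.0, 1.0)
                  for i in range(2)])
    return x, y

xs, ys = simulate_bam()
```

With these (contractive) parameters both layers decay toward the origin despite the multiplicative noise, which is the mean-square exponential stability behavior the paper's conditions are designed to certify.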
In this paper, we make the following assumptions on the neuron activation functions.
Assumption 1. For , the neuron activation functions , , , , , and in (2.2) are continuous as well as bounded on .
Assumption 2. For , the neuron activation functions in (2.2) satisfy
where , and are some constants.
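The sector-type condition of Assumption 2 bounds the difference quotients of each activation function between two constants. As an illustration, the following Python check verifies numerically that tanh satisfies such a condition with lower bound 0 and upper bound 1; these particular bound values are specific to tanh, not taken from the paper.

```python
import math

def in_sector(f, lower, upper, points):
    """Check the difference quotients (f(u) - f(v)) / (u - v) of f
    against the sector bounds [lower, upper] on a grid of points."""
    for u in points:
        for v in points:
            if u == v:
                continue
            q = (f(u) - f(v)) / (u - v)
            if not (lower - 1e-12 <= q <= upper + 1e-12):
                return False
    return True

pts = [i / 10.0 for i in range(-50, 51)]
ok = in_sector(math.tanh, 0.0, 1.0, pts)
```

Because the bounds are allowed to be negative or zero, the same check also admits nonmonotonic activations, which is what makes Assumption 2 more general than a Lipschitz condition.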
Remark 2.1. Assumption 2 was first introduced by Liu et al. [25]. The constants , and in Assumption 2 are allowed to be positive, negative, or zero. Hence, the activation functions used in this paper may be nonmonotonic, and they are more general than the usual sigmoid and Lipschitz functions. Such conditions give only a coarse quantification of the lower and upper bounds of the activation functions; we therefore use generalized activation functions, which are very helpful in reducing the possible conservatism of the LMI-based technique. In order to simplify our proof, we shift the equilibrium point and of system (2.2) to the origin. Let and ; then system (2.2) can be transformed to
where
Assumption 3. The activation functions satisfy the following condition:
Assumption 4. The constants satisfy the following convergent conditions:
Remark 2.2. Assumption 4 ensures that the terms and are convergent, which is significant for the subsequent analysis.
Assumption 5. There exist constant matrices and such that
The following definition and lemmas will be essential in establishing the exponential stability conditions.
Definition 2.3. The delayed discrete-time stochastic BAM neural network (2.7) is said to be globally exponentially stable, if there exist two positive scalars and such that
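Definition 2.3 characterizes global exponential stability through a geometric bound of the form E|x(k)|² ≤ M μᵏ with μ < 1 (these symbols are generic stand-ins, since the original notation did not survive extraction). Given a trajectory, the decay factor μ of the squared norm can be estimated by a least-squares fit on a log scale, as this Python sketch shows on synthetic data:

```python
import math

def estimate_decay(seq):
    """Least-squares slope of log|x(k)|^2 versus k; exp(slope) estimates
    the geometric decay factor mu in |x(k)|^2 <= M * mu**k."""
    ks = list(range(len(seq)))
    logs = [math.log(v * v) for v in seq]
    n = len(ks)
    kbar = sum(ks) / n
    lbar = sum(logs) / n
    slope = sum((k - kbar) * (l - lbar) for k, l in zip(ks, logs)) \
            / sum((k - kbar) ** 2 for k in ks)
    return math.exp(slope)

# synthetic trajectory decaying like 0.8**k (illustrative)
traj = [2.0 * 0.8 ** k for k in range(50)]
mu = estimate_decay(traj)   # squared norm decays like (0.8**2)**k = 0.64**k
```

For this exactly geometric sequence the fitted factor equals 0.8² = 0.64; for a simulated stochastic network one would average over sample paths before fitting.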
Lemma 2.4. Let and be any n-dimensional real vectors and let be an positive semidefinite matrix. Then, the following matrix inequality holds:
Lemma 2.5. Let be a positive semidefinite matrix, , and . If the series concerned are convergent, the following inequality holds:
In the rest of the paper, we will focus on the stability analysis of SBAMNN (2.7). By choosing an appropriate Lyapunov-Krasovskii functional, we aim to develop an LMI approach for deriving sufficient conditions under which the SBAMNN (2.7) is globally exponentially stable.
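The matrix forms of Lemmas 2.4 and 2.5 did not survive extraction, but a commonly used scalar special case of such summation inequalities is (Σᵢ xᵢ)² ≤ n Σᵢ xᵢ², a direct consequence of the Cauchy-Schwarz inequality; it is this kind of bound that lets the distributed-delay sums be dominated inside the Lyapunov analysis. A quick numerical sanity check in Python:

```python
import random

random.seed(1)

def jensen_gap(xs):
    """Slack of the scalar Jensen/Cauchy-Schwarz-type bound
    (sum x_i)^2 <= n * sum x_i^2; the returned value is >= 0."""
    n = len(xs)
    return n * sum(v * v for v in xs) - sum(xs) ** 2

samples = [[random.uniform(-1.0, 1.0) for _ in range(7)] for _ in range(100)]
gaps = [jensen_gap(s) for s in samples]
```

Equality holds exactly when all entries coincide, e.g. for the constant vector [1, 1, 1] the gap is zero.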
3. Main Results
Now, we are in a position to state our main results in the following theorem.
Theorem 3.1. Under Assumptions 1–5, the discrete-time stochastic BAM neural network (2.7) is globally exponentially stable in the mean square if there exist constants , diagonal matrices , , , , , and , and positive definite matrices , such that the following LMIs hold:
where
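Checking the feasibility of such LMIs ultimately reduces to testing symmetric matrices for positive definiteness. A minimal pure-Python check via Sylvester's criterion (all leading principal minors positive), adequate for the small dense matrices of the numerical example, can be sketched as follows; in practice a dedicated solver such as the MATLAB LMI Toolbox mentioned in the paper is used instead.

```python
def det(m):
    """Determinant by cofactor expansion (fine for small matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

def is_positive_definite(m):
    """Sylvester's criterion: every leading principal minor is positive."""
    n = len(m)
    return all(det([row[:k] for row in m[:k]]) > 0 for k in range(1, n + 1))

# illustrative symmetric tridiagonal matrix (positive definite)
P = [[2.0, -1.0, 0.0],
     [-1.0, 2.0, -1.0],
     [0.0, -1.0, 2.0]]
```

Its leading minors are 2, 3, and 4, all positive, so the test succeeds; a matrix such as [[1, 2], [2, 1]] fails because its determinant is negative.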
Proof. Let us choose the Lyapunov-Krasovskii functional as
In order to analyze the global exponential stability of the stochastic BAM neural network, we calculate the difference of the Lyapunov functional along the trajectories of the BAM neural network (2.7); then we have
where
By using Lemma 2.4, we have
It is clear from (2.9) that
which is equivalent to
where denotes the unit column vector having “1” element on its th row and zeros elsewhere. Consequently,
Similarly, from (3.10)–(3.14), we have
Then from (3.5)–(3.8) and (3.16)–(3.21), we obtain
where . Therefore, if the LMIs (3.1) hold, it can be concluded that . It follows that . By (3.22), the SBAMNN is globally asymptotically stable in the mean square. We are now in a position to establish the exponential stability of the SBAMNN (2.7). There exists a scalar such that
From (3.3), it can be verified that
where and . Choose a scalar , satisfying
Then by (3.23) and (3.24), we have
where and . Therefore, for any integer and , summing up both sides of (3.26) from 0 to with respect to , we have
Here, we note that for , , we have
Substituting (3.28) in (3.27) gives
We can observe that
It follows easily from (3.24) that
Then, it follows from (3.25), (3.29), and (3.31) that
where and
This indicates that the discrete-time stochastic BAM neural network (2.7) is globally exponentially stable. This completes the proof of the theorem.
For a deterministic BAM neural network, we have the following system of equations:
Then, by Theorem 3.1, it is straightforward to obtain the following theorem.
Theorem 3.2. Under Assumptions 1–4, the discrete-time BAM neural network (3.34) is globally exponentially stable, if there exist diagonal matrices , , , , and and positive definite matrices , and , such that the following LMIs hold:
where
and are defined in Theorem 3.1.
Proof. Similar to the proof of Theorem 3.1, we can derive the stability result. The proof is straightforward and hence omitted.
If we neglect the distributed delay term in (2.2), it can be reduced to
For system (3.37), we have the following stability result.
Corollary 3.3. Under Assumptions 1–5, the discrete-time BAM neural network (3.37) is globally exponentially stable, if there exist diagonal matrices , , , and , and positive definite matrices and , such that the following LMI holds:
where
and are defined in Theorem 3.1.
4. Numerical Example
To illustrate the effectiveness of our stability criterion, we give the following numerical example.
Example 4.1. Consider the SBAM neural networks (2.2) with the following parameters:
It can be verified that , , , , , , , , , , , , and with
By using Matlab LMI toolbox, we solve the LMIs (3.1) in Theorem 3.1 and obtain the feasible solutions as follows:
Then, it follows from Theorem 3.1 that the SBAMNN (2.7) with the given parameters is globally exponentially stable in the mean square. Our main purpose in this example is to estimate the maximum allowable delay upper bounds and for given lower bounds and (Table 1). For instance, if we set , the allowable time delay upper bound obtained by Gao and Cui [19] is 4. In this paper, however, we obtain that the system remains stable for any time delay satisfying , . This bound is much larger than that in [19], which shows that our method is less conservative (Figure 1).
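The delay-bound search behind Table 1 can be mimicked in a simple way: sweep the delay upward and record the largest value for which a stability test still passes. The Python sketch below substitutes a crude simulation-based probe on a hypothetical scalar system for the paper's LMI feasibility test; it illustrates only the search procedure, not the paper's actual numbers.

```python
import math

def stable_for_delay(tau, steps=400):
    """Crude stability probe for the scalar delayed system
    x[k+1] = x[k] - 0.15 * tanh(x[k - tau]): simulate and require the
    last 50 samples to be tiny. This stands in for an LMI feasibility
    test; all coefficients are illustrative."""
    b = 0.15
    x = [1.0] * (tau + 1)
    for k in range(tau, tau + steps):
        x.append(x[k] - b * math.tanh(x[k - tau]))
    return max(abs(v) for v in x[-50:]) < 1e-3

def max_allowable_delay(limit=60):
    """Largest tau (scanning upward) for which the probe reports decay."""
    best = 0
    for tau in range(1, limit + 1):
        if stable_for_delay(tau):
            best = tau
        else:
            break
    return best
```

For this system small delays pass the probe while large ones produce a sustained oscillation and fail, so the scan terminates with a finite delay bound, mirroring how the LMI-based estimate in the example is obtained.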
5. Conclusion
In this paper, we have considered the stability analysis problem for a class of discrete-time stochastic BAM neural networks with both discrete and distributed delays. By employing a Lyapunov-Krasovskii functional, a linear matrix inequality (LMI) approach has been developed to establish sufficient conditions for the SBAMNNs to be globally exponentially stable. It has been shown that the delayed SBAMNNs are globally exponentially stable if certain LMIs are solvable, and the feasibility of these LMIs can be easily checked by using the numerically efficient LMI Toolbox in MATLAB. A numerical example has been given to demonstrate the effectiveness of the obtained stability conditions.
Acknowledgments
The work of the first author was supported by UGC Rajiv Gandhi National Fellowship. The work of the second author was supported by the Korean Research Foundation Grant funded by the Korean Government with Grant no. KRF 2010-0003495 and the work of the third author was supported by the CSIR, New Delhi. The authors are very much thankful to the reviewers and editors for their valuable comments and suggestions for improving this work.
References
B. T. Cui and X. Y. Lou, “Global asymptotic stability of BAM neural networks with distributed delays and reaction-diffusion terms,” Chaos, Solitons and Fractals, vol. 27, no. 5, pp. 1347–1354, 2006.
X. Liao and K. W. Wong, “Robust stability of interval bidirectional associative memory neural network with time delays,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 34, no. 2, pp. 1142–1154, 2004.
X. Lou and B. Cui, “On the global robust asymptotic stability of BAM neural networks with time-varying delays,” Neurocomputing, vol. 70, no. 1-3, pp. 273–279, 2006.
X. Lou and B. Cui, “Global asymptotic stability of delay BAM neural networks with impulses based on matrix theory,” Applied Mathematical Modelling, vol. 32, no. 2, pp. 232–239, 2008.
Z. Liu, A. Chen, J. Cao, and L. Huang, “Existence and global exponential stability of periodic solution for BAM neural networks with periodic coefficients and time-varying delays,” IEEE Transactions on Circuits and Systems, vol. 50, no. 9, pp. 1162–1173, 2003.
M. Liu, “Global asymptotic stability analysis of discrete-time Cohen-Grossberg neural networks based on interval systems,” Nonlinear Analysis, vol. 69, no. 8, pp. 2403–2411, 2008.
Y. Li, “Global exponential stability of BAM neural networks with delays and impulses,” Chaos, Solitons and Fractals, vol. 24, no. 1, pp. 279–285, 2005.
R. Samidurai, R. Sakthivel, and S. M. Anthoni, “Global asymptotic stability of BAM neural networks with mixed delays and impulses,” Applied Mathematics and Computation, vol. 212, no. 1, pp. 113–119, 2009.
L. Sheng and H. Yang, “Novel global robust exponential stability criterion for uncertain BAM neural networks with time-varying delays,” Chaos, Solitons and Fractals, vol. 40, no. 5, pp. 2102–2113, 2009.
J. Tian and X. Zhou, “Improved asymptotic stability criteria for neural networks with interval time-varying delay,” Expert Systems with Applications, vol. 37, no. 12, pp. 7521–7525, 2010.
J. Tian and S. Zhong, “Improved delay-dependent stability criterion for neural networks with time-varying delay,” Applied Mathematics and Computation, vol. 217, no. 24, pp. 10278–10288, 2011.
J. Tian and X. Xie, “New asymptotic stability criteria for neural networks with time-varying delay,” Physics Letters A, vol. 374, no. 7, pp. 938–943, 2010.
S. Arik, “Global asymptotic stability analysis of bidirectional associative memory neural networks with time delays,” IEEE Transactions on Neural Networks, vol. 16, no. 3, pp. 580–586, 2005.
J. Cao and M. Dong, “Exponential stability of delayed bi-directional associative memory networks,” Applied Mathematics and Computation, vol. 135, no. 1, pp. 105–112, 2003.
Q. Song, “Exponential stability of recurrent neural networks with both time-varying delays and general activation functions via LMI approach,” Neurocomputing, vol. 71, no. 13–15, pp. 2823–2830, 2008.
J. Tian and S. Zhong, “New delay-dependent exponential stability criteria for neural networks with discrete and distributed time-varying delays,” Neurocomputing, vol. 74, no. 17, pp. 3365–3375, 2011.
M. Gao and B. Cui, “Global robust exponential stability of discrete-time interval BAM neural networks with time-varying delays,” Applied Mathematical Modelling, vol. 33, no. 3, pp. 1270–1284, 2009.
J. Liang and J. Cao, “Exponential stability of continuous-time and discrete-time bidirectional associative memory networks with delays,” Chaos, Solitons and Fractals, vol. 22, no. 4, pp. 773–785, 2004.
S. Mohamad, “Global exponential stability in continuous-time and discrete-time delayed bidirectional neural networks,” Physica D: Nonlinear Phenomena, vol. 159, no. 3-4, pp. 233–251, 2001.
R. Raja and S. M. Anthoni, “Global exponential stability of BAM neural networks with time-varying delays: the discrete-time case,” Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 2, pp. 613–622, 2011.
R. Raja, R. Sakthivel, and S. M. Anthoni, “Stability analysis for discrete-time stochastic neural networks with mixed time delays and impulsive effects,” Canadian Journal of Physics, vol. 88, no. 12, pp. 885–898, 2010.
X.-G. Liu, M. Wu, M.-L. Tang, and X.-B. Liu, “Global exponential stability for discrete-time BAM neural network with variable delay,” in Proceedings of the IEEE International Conference on Control and Automation, pp. 3139–3143, May 2007.
X. G. Liu, M. L. Tang, R. Martin, and X. B. Liu, “Discrete-time BAM neural networks with variable delays,” Physics Letters A, vol. 367, no. 4-5, pp. 322–330, 2007.