Discrete Dynamics in Nature and Society
Volume 2009 (2009), Article ID 201068, 12 pages
http://dx.doi.org/10.1155/2009/201068
Dynamical Analysis of DTNN with Impulsive Effect
School of Sciences, Jimei University, Xiamen 361021, China
Received 11 March 2009; Accepted 30 September 2009
Academic Editor: Yong Zhou
Copyright © 2009 Chao Chen et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract
We present a dynamical analysis of discrete-time delayed neural networks with impulsive effect. Under impulsive effect, we derive some new criteria for the invariance and attractivity of discrete-time neural networks by using a decomposition approach and delay difference inequalities. Our results improve or extend existing ones.
1. Introduction
As we know, the mathematical model of a neural network consists of four basic components: an input vector, a set of synaptic weights, a summing function with an activation (or transfer) function, and an output. From the viewpoint of mathematics, an artificial neural network corresponds to a nonlinear transformation of some inputs into certain outputs. Due to their promising potential for tasks of classification, associative memory, parallel computation, and solving optimization problems, neural network architectures have been extensively researched and developed [1–25]. Most neural network models can be classified as either continuous-time or discrete-time ones. For related works, we refer to [20, 24, 26–28].
However, besides the delay effect, an impulsive effect likewise exists in a wide variety of evolutionary processes, in which states are changed abruptly at certain moments of time [5, 29]. As is well known, stability is one of the major problems encountered in applications and has attracted considerable attention due to its important role there. However, under impulsive perturbation, an equilibrium point does not exist in many physical systems, especially in nonlinear dynamical systems. Therefore, an interesting subject is to discuss the invariant sets and the attracting sets of impulsive systems. Some significant progress has been made in the techniques and methods of determining the invariant sets and attracting sets for delay difference equations with discrete variables, delay differential equations, and impulsive functional differential equations [30–37]. Unfortunately, the corresponding problems for discrete-time neural networks with impulses and delays have not been considered. Motivated by the above-mentioned papers and discussion, we here make a first attempt to establish results on the invariant sets and attracting sets of discrete-time neural networks with impulses and delays.
2. Preliminaries
In this paper, we consider the following discrete-time neural networks under impulsive effect:
where are real constants, ; , are positive real numbers such that is an impulsive sequence such that and are real-valued functions.
By a solution of (2.1), we mean a piecewise continuous real-valued function defined on the interval which satisfies (2.1) for all .
In the sequel, by we will denote the set of all continuous real-valued functions defined on the interval , which satisfy the compatibility condition:
By the method of steps, one can easily see that, for any given initial function , there exists a unique solution of (2.1) which satisfies the initial condition:
this function will be called the solution of the initial value problem (2.1)–(2.3).
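The displayed system (2.1)–(2.3) did not survive extraction, so, purely for orientation, the following is a minimal simulation sketch of a generic discrete-time delayed neural network with impulsive state resets. The matrices A and B, the input J, the activation f, the delay tau, and the impulse instants and impulse map are illustrative assumptions, not the paper's actual model data.

```python
import numpy as np

# Minimal sketch (assumed generic form, not the paper's exact model):
#   x(n+1) = A x(n) + B f(x(n - tau)) + J,  if n+1 is not an impulse instant,
#   x(n+1) = g(x((n+1)^-))                  at impulse instants.
def simulate(A, B, J, f, tau, impulse_instants, g, x_history, steps):
    """x_history: assumed initial history [x(-tau), ..., x(0)]."""
    traj = [np.asarray(x, dtype=float) for x in x_history]
    for n in range(steps):
        x_next = A @ traj[-1] + B @ f(traj[-1 - tau]) + J   # delayed difference equation
        if (n + 1) in impulse_instants:                      # abrupt state change
            x_next = g(x_next)
        traj.append(x_next)
    return np.array(traj)

# Example usage with placeholder values.
A = np.diag([0.5, 0.7]); B = np.array([[0.1, -0.2], [0.05, 0.1]]); J = np.array([0.2, -0.1])
out = simulate(A, B, J, np.tanh, tau=3, impulse_instants=set(range(5, 101, 5)),
               g=lambda x: 0.8 * x, x_history=[np.ones(2)] * 4, steps=100)
print(out[-1])
```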
For convenience, we rewrite (2.1) and (2.3) into the following vector form
where , , , , , , and , in which
In what follows, we will introduce some notations and basic definitions.
Let $\mathbb{R}^n$ be the space of $n$-dimensional real column vectors and let $\mathbb{R}^{m\times n}$ denote the set of $m\times n$ real matrices. $E$ denotes an identity matrix with appropriate dimensions. For $A, B \in \mathbb{R}^{m\times n}$, $A \ge B$ (respectively, $A > B$) means that each pair of corresponding elements of $A$ and $B$ satisfies the inequality $\ge$ (respectively, $>$). In particular, $A$ is called a nonnegative matrix if $A \ge 0$, and $z$ is called a positive vector if $z > 0$. $\rho(A)$ denotes the spectral radius of $A$.
denotes the space of continuous mappings from the topological space to the topological space .
and exist for for and except for points , where is an interval, and denote the right limit and left limit of function , respectively. Especially, let .
Definition 2.1. The set is called a positive invariant set of (2.4), if for any initial value , one has the solution for .
Definition 2.2. The set is called a global attracting set of (2.4), if for any initial value , the solution converges to as That is, where In particular, is called asymptotically stable.
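Definitions 2.1 and 2.2 describe sets that trajectories never leave (positive invariance) or approach asymptotically (global attractivity). In Section 3 these sets are built from a positive vector, so for box-shaped sets the two properties can be checked numerically with a helper of the following kind; the box form and all names below are assumptions made for illustration only.

```python
import numpy as np

# Distance from a point x to an assumed box-shaped set S = { y : |y_i| <= z_i }.
# A trajectory stays in S when this distance remains zero (positive invariance),
# and is globally attracted to S when the distance tends to zero along time.
def dist_to_box(x, z):
    return float(np.linalg.norm(np.maximum(np.abs(x) - z, 0.0)))

z = np.array([1.0, 2.0])
print(dist_to_box(np.array([0.5, -1.5]), z))   # 0.0: the point lies in S
print(dist_to_box(np.array([3.0, -1.0]), z))   # 2.0: the point lies outside S
```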
Following [33], we split the matrices into two parts, respectively,
with ,, ,,,
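The displayed splitting did not survive extraction. In the decomposition approach of [33], the connection matrices are typically separated into their entrywise nonnegative (cooperative) and nonpositive (competitive) parts; the sketch below assumes that reading and is not a statement of the paper's exact formulas.

```python
import numpy as np

# Assumed reading of the splitting: W = W_plus + W_minus, where W_plus keeps
# the nonnegative entries of W and W_minus keeps the nonpositive ones.
def split(W):
    W_plus = np.maximum(W, 0.0)
    W_minus = np.minimum(W, 0.0)
    return W_plus, W_minus

W = np.array([[0.2, -0.1], [-0.3, 0.4]])
W_plus, W_minus = split(W)
assert np.allclose(W_plus + W_minus, W)
print(W_plus, W_minus, sep="\n")
```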
Then the first equation of (2.4) can be rewritten as
Now take the symmetric transformation . From (2.7), it follows that
where
Set
By virtue of (2.8) and (2.7), we have
Set . In view of the impulsive part of (2.4), we also have , , and so we obtain
where
Lemma 2.3 (see [34]). Suppose that and . Then there exists a positive vector such that
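The displayed inequality of Lemma 2.3 is missing from the extracted text. Read in the standard M-matrix form, the lemma asserts that a nonnegative matrix M with spectral radius less than one admits a positive vector z with Mz < z (equivalently, (E − M)z > 0). Under that assumed reading, such a vector can be computed explicitly, as in this sketch:

```python
import numpy as np

# Under the assumed reading of Lemma 2.3: if M >= 0 entrywise and rho(M) < 1,
# then z = (E - M)^{-1} * 1 is positive and satisfies M z < z, because
# (E - M)^{-1} = E + M + M^2 + ... >= 0 and (E - M) z = 1 > 0.
def lemma_2_3_vector(M):
    n = M.shape[0]
    assert np.all(M >= 0) and np.max(np.abs(np.linalg.eigvals(M))) < 1
    return np.linalg.solve(np.eye(n) - M, np.ones(n))

M = np.array([[0.3, 0.2], [0.1, 0.4]])
z = lemma_2_3_vector(M)
print(z > 0, M @ z < z)   # both componentwise True
```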
For and , one denotes
By Lemma 2.3, we have the following result.
Lemma 2.4. is nonempty, and for any scalars and vectors , one has
Lemma 2.5. Assume that satisfy
where ,
If , then there exists a positive vector such that
where is a constant defined as
for the given .
Proof. Since and , by Lemma 2.3, there exists a positive vector such that .
Set then we have
Due to
there must exist a , such that
For , , there exists a positive constant such that
where
By Lemma 2.4, . Without loss of generality, we can find a such that
Set . Substituting this into (2.15), we have
By (2.23), we get that
Next, we will prove for any ,
To this end, we consider an arbitrary number and claim that
Otherwise, by the continuity of , there must exist a and index such that
Then, by using (2.24) and (2.28), from (2.22), we obtain
which is a contradiction. Hence, (2.27) holds for all numbers . It follows immediately that (2.26) is always satisfied, which easily leads to (2.16). This completes the proof.
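Lemma 2.5 is the delay difference inequality used later to obtain the attracting set. Its displayed inequalities are missing from the extracted text, but estimates of this type typically bound solutions of u(n+1) ≤ P u(n) + Q · max over the delay window + J, with P, Q ≥ 0 and P + Q < 1, by a geometrically decaying term plus a constant level. The scalar sketch below only illustrates that assumed behaviour and is not the paper's inequality.

```python
# Scalar sketch of an assumed Halanay-type delay difference inequality:
#   u(n+1) <= P u(n) + Q * max_{n-tau <= s <= n} u(s) + J,  with P, Q >= 0, P + Q < 1.
# Solutions decay geometrically toward the constant level J / (1 - P - Q).
P, Q, J, tau = 0.5, 0.3, 0.2, 3
u = [5.0] * (tau + 1)                       # assumed initial history
for n in range(80):
    u.append(P * u[-1] + Q * max(u[-tau - 1:]) + J)
print(u[-1], J / (1 - P - Q))               # both close to 1.0
```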
3. Main Results
For convenience, we introduce the following assumptions.
For any , there exist a nonnegative matrix and a nonnegative vector such that
For any , there exist nonnegative matrices and a nonnegative vector such that
Also and , where
Also is nonempty.
Theorem 3.1. Assume that hold. Then there exists a positive vector such that is a positive invariant set of (2.4), where .
Proof. From and , we can claim that for any ,
where , and satisfy , ,
So, by using (2.10) and (2.11) and taking into account (3.3), we get
respectively.
By assumptions , and Lemma 2.3, there exists a positive vector such that
Since and are positive constant vectors, by (3.6), there must exist two scalars , such that
respectively.
Set
by Lemma 2.4, clearly, and
Next, we will prove, for any , that is, ,
In order to prove (3.11), we first prove, for any ,
If (3.12) is false, by the piecewise continuous nature of , there must exist a and an index such that
Denoting , we get
This is a contradiction and hence (3.12) holds. From the fact that (3.12) is fulfilled for any , it follows immediately that (3.11) is always satisfied.
On the other hand, by using (3.5), (3.10), and (3.11), we obtain that
Therefore, we can claim that
In a similar way to the proof of (3.11), we can prove that (3.16) implies
Hence, by the induction principle, we conclude that
which implies holds for any , that is, for any . This completes the proof of the theorem.
Remark 3.2. In fact, under the assumptions of Theorem 3.1, the must exist. For example, since and imply and , respectively, we may take as follows:
Theorem 3.3. If assumptions hold, then the is a global attracting set of (2.4), where , , and the vector is chosen as in (3.19).
Proof. From (3.4), assumption , and Lemma 2.5, and taking into account the definition of , we obtain that
where the positive vector and satisfy
From (3.15), taking into account the definition of , we get that
Therefore, we have that
By using (3.20), (3.23), and Lemma 2.5 again, we obtain that
Hence, by the induction principle, we conclude that
which implies that the conclusion holds. The proof is complete.
4. An Illustrative Example
Consider the system (2.1) with the following parameters , , , , , , , , ,
From the given parameters, we have
Obviously, according to Theorems 3.1 and 3.3, the set is a positive invariant set and a global attracting set of (2.4).
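The concrete parameters of this example were lost in extraction. For orientation only, the sketch below runs a two-neuron network of the assumed generic form from Section 2 with placeholder parameters and impulses, and checks numerically that trajectories started from widely different initial histories end up in, and remain in, one common small box, which is the behaviour Theorems 3.1 and 3.3 predict for the true example.

```python
import numpy as np

# Placeholder two-neuron example (the paper's actual parameters are not
# recoverable from the extracted text): contractive linear part, bounded
# activation, and mildly contracting impulses every 10 steps.
rng = np.random.default_rng(0)
A = np.diag([0.5, 0.6]); B = np.array([[0.1, -0.2], [0.15, 0.1]]); J = np.array([0.3, -0.2])
tau, steps = 2, 300
impulse_instants = set(range(10, steps + 1, 10))

finals = []
for _ in range(5):
    hist = [rng.uniform(-5.0, 5.0, size=2) for _ in range(tau + 1)]
    for n in range(steps):
        x = A @ hist[-1] + B @ np.tanh(hist[-1 - tau]) + J
        if (n + 1) in impulse_instants:
            x = 0.9 * x                      # assumed impulsive perturbation
        hist.append(x)
    finals.append(hist[-1])
print(np.round(finals, 3))                   # final states cluster in a small box
```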
5. Conclusion
In this paper, by using M-matrix theory and a decomposition approach, some new criteria for the invariance and attractivity of discrete-time neural networks with delays and impulses have been obtained. Moreover, these conditions can be easily checked in practice.
Acknowledgments
This work was supported by the Foundation of Education of Fujian Province, China (JA07142), the Scientific Research Foundation of Fujian Province, China (JA09152), the Foundation for Young Professors of Jimei University, the Scientific Research Foundation of Jimei University, and the Foundation for Talented Youth with Innovation in Science and Technology of Fujian Province (2009J05009).
References
- M. A. Cohen and S. Grossberg, “Absolute stability of global pattern formation and parallel memory storage by competitive neural networks,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 13, no. 5, pp. 815–826, 1983.
- G. A. Carpenter, M. A. Cohen, and S. Grossberg, “Computing with neural networks,” Science, vol. 235, no. 4793, pp. 1226–1227, 1987.
- A. Guez, V. Protopopescu, and J. Barhen, “On the stability, storage capacity and design of nonlinear continuous neural networks,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 18, no. 1, pp. 80–87, 1988.
- A. M. Samoilenko and N. A. Perestyuk, Differential Equations with Impulse Effect, Visca Skola, Kiev, Ukraine, 1987.
- V. Lakshmikantham, D. D. Bainov, and P. S. Simeonov, Theory of Impulsive Differential Equations, World Scientific, Singapore, 1989.
- A. N. Michel, J. A. Farrell, and W. Porod, “Qualitative analysis of neural networks,” IEEE Transactions on Circuits and Systems, vol. 36, no. 2, pp. 229–243, 1989.
- J. H. Li, A. N. Michel, and W. Porod, “Qualitative analysis and synthesis of a class of neural networks,” IEEE Transactions on Circuits and Systems, vol. 35, no. 8, pp. 976–986, 1988.
- X. X. Liao, “Stability of Hopfield-type neural networks. I,” Science in China, vol. 38, pp. 407–418, 1993.
- K. Gopalsamy and X. Z. He, “Stability in asymmetric Hopfield nets with transmission delays,” Physica D, vol. 76, no. 4, pp. 344–358, 1994.
- K. Gopalsamy and X. He, “Delay-independent stability in bidirectional associative memory networks,” IEEE Transactions on Neural Networks, vol. 5, no. 6, pp. 998–1002, 1994.
- X. Liao and J. Yu, “Qualitative analysis of bidirectional associative memory with time delays,” International Journal of Circuit Theory and Applications, vol. 26, no. 3, pp. 219–229, 1998.
- S. Mohamad and K. Gopalsamy, “Dynamics of a class of discrete-time neural networks and their continuous-time counterparts,” Mathematics and Computers in Simulation, vol. 53, no. 1-2, pp. 1–39, 2000.
- J. Cao, “Periodic solutions and exponential stability in delayed cellular neural networks,” Physical Review E, vol. 60, no. 3, pp. 3244–3248, 1999.
- J. Cao, “New results concerning exponential stability and periodic solutions of delayed cellular neural networks,” Physics Letters A, vol. 307, no. 2-3, pp. 136–147, 2003.
- J. Cao and J. Liang, “Boundedness and stability for Cohen-Grossberg neural network with time-varying delays,” Journal of Mathematical Analysis and Applications, vol. 296, no. 2, pp. 665–685, 2004.
- J. Cao and J. Wang, “Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays,” Neural Networks, vol. 17, no. 3, pp. 379–390, 2004.
- J. Cao and Q. Song, “Stability in Cohen-Grossberg-type bidirectional associative memory neural networks with time-varying delays,” Nonlinearity, vol. 19, no. 7, pp. 1601–1617, 2006.
- S. Hu and J. Wang, “Global stability of a class of continuous-time recurrent neural networks,” IEEE Transactions on Circuits and Systems I, vol. 49, no. 9, pp. 1334–1347, 2002.
- S. Mohamad and K. Gopalsamy, “Neuronal dynamics in time varying environments: continuous and discrete time models,” Discrete and Continuous Dynamical Systems, vol. 6, no. 4, pp. 841–860, 2000.
- K. Gopalsamy, “Stability of artificial neural networks with impulses,” Applied Mathematics and Computation, vol. 154, no. 3, pp. 783–813, 2004.
- Z. Wang, Y. Liu, and X. Liu, “On global asymptotic stability of neural networks with discrete and distributed delays,” Physics Letters A, vol. 345, no. 4–6, pp. 299–308, 2005.
- Y. Wang, W. Xiong, Q. Zhou, B. Xiao, and Y. Yu, “Global exponential stability of cellular neural networks with continuously distributed delays and impulses,” Physics Letters A, vol. 350, no. 1-2, pp. 89–95, 2006.
- X. Y. Lou and B. T. Cui, “Global asymptotic stability of delay BAM neural networks with impulses,” Chaos, Solitons & Fractals, vol. 29, no. 4, pp. 1023–1031, 2006.
- Y. Yang and J. Cao, “Stability and periodicity in delayed cellular neural networks with impulsive effects,” Nonlinear Analysis: Real World Applications, vol. 8, no. 1, pp. 362–374, 2007.
- H. Zhao, “Global asymptotic stability of Hopfield neural network involving distributed delays,” Neural Networks, vol. 17, no. 1, pp. 47–53, 2004.
- H. Akça, R. Alassar, V. Covachev, Z. Covacheva, and E. Al-Zahrani, “Continuous-time additive Hopfield-type neural networks with impulses,” Journal of Mathematical Analysis and Applications, vol. 290, no. 2, pp. 436–451, 2004.
- Y. Li and L. Lu, “Global exponential stability and existence of periodic solution of Hopfield-type neural networks with impulses,” Physics Letters A, vol. 333, no. 1-2, pp. 62–71, 2004.
- T. Chen and S. I. Amari, “Stability of asymmetric Hopfield networks,” IEEE Transactions on Neural Networks, vol. 12, no. 1, pp. 159–163, 2001.
- D. D. Bainov and P. S. Simeonov, Impulsive Differential Equations: Periodic Solutions and Applications, Longman, London, UK, 1993.
- D. Xu, “Asymptotic behavior of nonlinear difference equations with delays,” Computers & Mathematics with Applications, vol. 42, no. 3–5, pp. 393–398, 2001.
- D. Xu, “Invariant and attracting sets of Volterra difference equations with delays,” Computers & Mathematics with Applications, vol. 45, no. 6–9, pp. 1311–1317, 2003.
- D. Xu and Z. Yang, “Attracting and invariant sets for a class of impulsive functional differential equations,” Journal of Mathematical Analysis and Applications, vol. 329, no. 2, pp. 1036–1044, 2007.
- Z. Huang and Y. Xia, “Exponential -stability of second order Cohen-Grossberg neural networks with transmission delays and learning behavior,” Simulation Modelling Practice and Theory, vol. 15, no. 6, pp. 622–634, 2007.
- Z. Huang, S. Mohamad, X. Wang, and C. Feng, “Convergence analysis of general neural networks under almost periodic stimuli,” International Journal of Circuit Theory and Applications, vol. 37, no. 6, pp. 733–750, 2008.
- Z. Huang, S. Mohamad, and H. Bin, “Multistability of HNNs with almost periodic stimuli and continuously distributed delays,” International Journal of Systems Science, vol. 40, no. 6, pp. 615–625, 2009.
- T. Chu, Z. Zhang, and Z. Wang, “A decomposition approach to analysis of competitive-cooperative neural networks with delay,” Physics Letters A, vol. 312, no. 5-6, pp. 339–347, 2003.
- J. P. LaSalle, The Stability of Dynamical Systems, SIAM, Philadelphia, Pa, USA, 1976.