Journal of Applied Mathematics
Volume 2008 (2008), Article ID 453627, 19 pages
Periodic Oscillation of Fuzzy Cohen-Grossberg Neural Networks with Distributed Delay and Variable Coefficients
Hongjun Xiang and Jinde Cao
1Department of Mathematics, Southeast University, Nanjing 210096, China
2Department of Mathematics, Xiangnan University, Chenzhou 423000, China
Received 4 July 2007; Accepted 6 December 2007
Academic Editor: George Jaiani
Copyright © 2008 Hongjun Xiang and Jinde Cao. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
A class of fuzzy Cohen-Grossberg neural networks with distributed delay and variable coefficients is discussed. Without employing coincidence degree theory or constructing Lyapunov functionals, but instead by applying matrix theory and inequality analysis, some sufficient conditions are obtained to ensure the existence, uniqueness, global attractivity, and global exponential stability of the periodic solution of fuzzy Cohen-Grossberg neural networks. The method is concise and practical. Moreover, two examples are given to illustrate the effectiveness of our results.
1. Introduction
Recently, Cohen-Grossberg neural networks (CGNNs) and fuzzy cellular neural networks (FCNNs), in their various forms, have attracted many scholars' attention due to their potential applications in classification, associative memory, parallel computation, image processing, and pattern recognition, especially in white blood cell detection and in the solution of some optimization problems; see [1–22]. It is well known that FCNNs [16–18] were proposed by T. Yang and L. Yang in 1996. Unlike traditional CNN structures, FCNNs have fuzzy logic between their template input and output in addition to the sum-of-product operation. It is worth noting that studies have indicated FCNNs' potential applications in many fields such as image and signal processing, pattern recognition, and white blood cell detection. Some stability results have been derived for FCNN models without or with time delays; see [16–21]. In [22], the authors obtained a new detection algorithm based on FCNNs for white blood cell detection, with which one can detect almost all white blood cells, and the contour of each detected cell is nearly complete.
It is also noted that CGNNs were presented by Cohen and Grossberg in 1983 [1]. This model is a very general class of neural networks, which includes many famous networks such as the Lotka-Volterra ecological system, Hopfield neural networks, and cellular neural networks. Up to now, some useful results have been given to ensure the existence and stability of the equilibrium point of CGNNs without or with time delays; see [1–11, 13–15]. In [4], boundedness and stability were analyzed for a class of CGNNs with time-varying delays using the inequality technique and the Lyapunov method. In [2, 8], by constructing suitable Lyapunov functionals and using the linear matrix inequality (LMI) technique, the authors investigated a class of delayed CGNNs and obtained several criteria guaranteeing the global asymptotic stability of the equilibrium point of this system. In [3, 6, 7, 9, 13, 15], global exponential stability is discussed for a class of delayed CGNNs via nonsmooth analysis. In [14], the authors integrated fuzzy logic into the structure of CGNNs while maintaining local connectedness among cells, obtaining the so-called fuzzy Cohen-Grossberg neural networks (FCGNNs), and studied impulsive effects on the stability of FCGNNs with time-varying delays.
Due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths, neural networks usually have a spatial nature, so it is desirable to model them by introducing continuously distributed delays over a certain duration of time, so that the distant past has less influence than the recent behavior of the state. Hence, some authors have studied neural networks with distributed delays; see, for example, [5, 20]. In [20], stability was considered for a class of FCNNs with distributed delay, under a certain condition on the feedback kernel. In [5], following the idea of the vector Lyapunov function, M-matrix theory, and the inequality technique, the authors studied a Cohen-Grossberg neural network model with both time-varying and continuously distributed delays and obtained several sufficient conditions ensuring the existence, uniqueness, and global exponential stability of the equilibrium point of this system; the delay kernel there is a real-valued nonnegative continuous function subject to a suitable integrability condition.
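Distributed-delay models of this kind typically normalize the feedback kernel so that its total mass over the delay interval equals one (the exact condition imposed in [5, 20] is stripped from this copy). As a minimal numerical sketch under that assumption, taking the common exponential kernel k(s) = e^{-s} as an illustrative choice, the normalization can be checked with the trapezoidal rule:

```python
# Numerically verify that the exponential delay kernel k(s) = exp(-s)
# integrates to 1 over [0, inf), the usual normalization assumed for
# distributed-delay kernels.  (The paper's exact kernel condition is
# stripped from this copy; the exponential kernel is illustrative only.)
import math

def kernel(s):
    return math.exp(-s)

def integrate(f, a, b, n=100_000):
    """Composite trapezoidal rule on [a, b] with n subintervals."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Truncate the infinite interval at T = 40; the tail contributes ~exp(-40).
mass = integrate(kernel, 0.0, 40.0)
print(mass)  # close to 1
```

Any other nonnegative kernel can be normalized the same way by dividing it by its computed mass.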
Moreover, studies on neural dynamical systems involve not only a discussion of stability properties but also many other dynamic behaviors, such as periodic oscillation; see [21, 23, 24] and the references therein. Although many results concern the stability of CGNNs [1–14], to the best of our knowledge, FCGNNs with distributed delay and variable coefficients have seldom been considered. Motivated by the above discussion, the objective of this paper is to study the periodic oscillatory solutions of fuzzy Cohen-Grossberg neural networks with distributed delay and variable coefficients and to obtain several novel sufficient conditions ensuring the existence, uniqueness, global attractivity, and global exponential stability of periodic solutions of the model with periodic external inputs. In this paper, we use neither coincidence degree theory nor the Lyapunov method, but instead inequality analysis and matrix theory.
The rest of the paper is organized as follows. In Section 2, we describe the model and introduce some necessary notations, definitions, and preliminaries which will be used later. In Section 3, several sufficient conditions are derived for the existence, uniqueness, global attractivity, and global exponential stability of periodic solutions. In Section 4, two examples are given to show the effectiveness of the obtained results. Finally, we give conclusions in Section 5.
2. Model Description and Preliminaries
In this paper, we consider the following fuzzy Cohen-Grossberg neural networks with periodic coefficients and distributed delays: where , is the number of neurons in the network, and denotes the neuron state vector; is an amplification function and denotes an appropriately behaved function; is the activation function of the neurons; and are elements of the feedback template and the feed-forward template at time , respectively; and are elements of the fuzzy feedback MIN template and the fuzzy feedback MAX template, respectively; and are elements of the fuzzy feed-forward MIN template and the fuzzy feed-forward MAX template, respectively; and denote the fuzzy AND and fuzzy OR operations, respectively; and denote the input and the bias of the th neuron at time , respectively; is the feedback kernel, defined on the interval when is a positive finite number or on when is infinite.
Let denote the Banach space of continuous mappings from to with the topology of uniform convergence. For a given and a continuous function , we define by for . The initial conditions of (2.1) are of the form , where is bounded and continuous on . For a matrix , let denote the spectral radius of . A matrix or a vector means that all elements of are greater than or equal to zero; is defined similarly. For define , , where
Remark 2.1. when is a positive finite number or while is infinite.
To obtain our results, we make the following assumptions.
(A1) The functions are continuous and bounded; that is, there exist positive constants and so that
(A2) Each with is continuous, and there exists a constant so that
(A3) For each activation function with , there exists so that
(A4) The feedback kernels satisfy and where is a positive finite number or infinite.
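Assumption (A3) is a global Lipschitz condition on the activation functions. As an illustrative spot-check (the paper's activations are unspecified in this copy; f(x) = tanh(x) is an assumed example, whose Lipschitz constant is 1 since sup|f'| = 1):

```python
# Spot-check assumption (A3) for the illustrative activation f(x) = tanh(x):
# |tanh(u) - tanh(v)| <= L * |u - v| with L = 1, on random sample pairs.
# (The paper's activation functions are not specified in this copy.)
import math
import random

random.seed(0)
L = 1.0  # Lipschitz constant of tanh (sup of |f'(x)| = sech^2(x) is 1)
ok = all(
    abs(math.tanh(u) - math.tanh(v)) <= L * abs(u - v) + 1e-15
    for u, v in ((random.uniform(-5, 5), random.uniform(-5, 5))
                 for _ in range(10_000))
)
print(ok)  # True
```

The same check applies to any candidate activation once an upper bound on its derivative is known.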
In addition, we will use the following notations throughout this paper:
Definition 2.2 (see ). Let . Suppose that is a Banach space and that is a given mapping. Define by . A process on is a mapping satisfying the following properties: (i) is continuous; (ii) is the identity; (iii)
A process is said to be an -periodic process if there is an such that and .
Remark 2.3. Suppose that is completely continuous and let denote a solution of the system through and assume is uniquely defined for . If for , then is a process on . If there is an such that for all , then the process generated by is an -periodic process. Furthermore, if is defined as before, then , where is the solution operator of satisfying .
Definition 2.4 (see ). A continuous map is said to be point dissipative if there exists a bounded set such that attracts each point of .
Lemma 2.5 (see ). If an -periodic generates an -periodic process on , is a bounded map for each and is point dissipative, then there exists a compact, connected, global attractor. Also, there is an -periodic solution of .
For , Lemma 2.5 provides the existence of a periodic solution under the weak assumption of point dissipativeness. System (2.1) can be rewritten as the system , and one can show that system (2.1) generates an -periodic process on (see [24, 25]). Under some assumptions, one can view system (2.1) as a dissipative system and apply Lemma 2.5 to it.
Lemma 2.6 (see ). If for , then , where denotes the identity matrix of size and denotes a square matrix of size .
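Under the usual reading of Lemma 2.6 (if a nonnegative square matrix A has spectral radius less than one, then I − A is invertible and (I − A)^{-1} equals the Neumann series Σ_{k≥0} A^k, which is nonnegative), the conclusion can be illustrated numerically. The 2×2 matrix below is an arbitrary illustrative choice, not taken from the paper:

```python
# Numerical illustration of the Neumann-series reading of Lemma 2.6:
# for a nonnegative A with rho(A) < 1, the partial sums of I + A + A^2 + ...
# approximate (I - A)^(-1), and every entry of the sum is nonnegative.

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_add(A, B):
    n = len(A)
    return [[A[i][j] + B[i][j] for j in range(n)] for i in range(n)]

def neumann_sum(A, terms=200):
    """Partial sum I + A + A^2 + ... + A^terms of the Neumann series."""
    n = len(A)
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # I
    P = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # A^k
    for _ in range(terms):
        P = mat_mul(P, A)
        S = mat_add(S, P)
    return S

A = [[0.2, 0.3],
     [0.1, 0.4]]  # nonnegative, eigenvalues 0.5 and 0.1, so rho(A) < 1

S = neumann_sum(A)
# Check that S is (approximately) the inverse of I - A:
IminusA = [[(1.0 if i == j else 0.0) - A[i][j] for j in range(2)]
           for i in range(2)]
R = mat_mul(IminusA, S)
print(R)  # approximately the 2x2 identity
assert all(S[i][j] >= 0 for i in range(2) for j in range(2))
```

Since (I − A)·Σ_{k=0}^{N} A^k = I − A^{N+1} and A^{N+1} vanishes geometrically when ρ(A) < 1, the partial sum converges to the inverse.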
Definition 2.7. For system (2.1), an -periodic solution is said to be globally exponentially stable if there exist constants and so that for every solution of system (2.1) with any initial value . Moreover, is called the global exponential convergence rate.
3. Main Results and Proofs
In this section, by making use of inequality analysis and matrix theory, we derive some sufficient conditions ensuring the existence, uniqueness, global attractivity, and global exponential stability of the periodic solution of system (2.1). First, in order to ensure the existence of a periodic attractor, we must find an invariant set in which the expected periodic attractor is located.
Theorem 3.1. Assume that assumptions (A1)–(A4) hold and that (A5) , where and . Then the set is a positively invariant set of system (2.1), where the vector , and ; denotes the identity matrix of size .
Proof. Associated with system (2.1), (A1)–(A4), and Lemma 2.8, we have Applying the variation-of-constants formula to the above, we can derive that Together with and Lemma 2.6, one has . In view of , we then have . We will prove that if , then It suffices to show that, for all , if , then We argue by contradiction. If this is not true, there exist some and such that where is the th component of the vector . Noting that or , according to (3.5) and (3.2) we have In view of , we can easily conclude from (3.7) that , which is a contradiction. Therefore, (3.4) holds. Letting , we conclude that (3.3) holds. The proof of Theorem 3.1 is complete.
In order to investigate the global attractivity of the periodic solutions, it is necessary to establish the global attractivity of the invariant set .
Proof. By a discussion similar to that of Theorem 3.1, for any given initial value , there exists some constant such that if , then If this is not true, there exists a nonnegative vector such that By the definition of the upper limit of and (3.8), for any sufficiently small , there exists some such that By the continuity of the function , set ; then we have for all .
According to (2.1) and (3.9)–(3.11), we have It follows from (3.9) and the properties of the upper limit of that there exists a sequence so that and Letting , we have Noting that , we can easily obtain that , that is, . By [27, Theorem 8.3.2], we have , which is a contradiction. Thus , and this completes the proof of the theorem.
Theorem 3.2 implies that, under the assumptions of Theorem 3.1, system (2.1) is a dissipative system. Combining Lemma 2.5 and Theorem 3.2, we can easily deduce the following sufficient conditions for the existence of a global periodic attractor of system (2.1).
Proof. Obviously, system (2.1) generates an -periodic process by Lemma 2.5 and Theorem 3.2. Moreover, together with the assumptions of Theorem 3.1, we can conclude that there exists an -periodic solution, denoted by .
Let be an arbitrary solution of model (2.1) and use the substitution ; then the neural network model (2.1) can be rewritten as where , , ,
Associated with system (3.15), (A1)–(A4), and Lemma 2.8, we have where Repeating the above arguments, it is easy to see that system (3.15) satisfies all conditions of Theorem 3.2 with . Thus , which implies that the periodic solution is globally attracting. This completes the proof of the theorem.
Remark 3.4. By Theorem 3.3, all other solutions of system (2.1) converge to the -periodic solution as . Theorem 3.3 provides a guideline on the choice of interconnection matrices for a neural network that is desired to have a unique global attractor. Moreover, if the parameters and are not periodic in time , then under assumptions (A1)–(A4) and for all , Theorems 3.1 and 3.2 remain valid.
Proof. According to Theorem 3.3, we only need to prove that of system (3.15) is globally exponentially stable. Consider the function given by By (3.17), we have , and is continuous with as . Thus, there exists a so that . Without loss of generality, set , so that when . Now let ; when , we have , that is, Let be a solution of system (3.15) with any initial value ; then we have for and . Set Then it follows from (3.21) and (3.22) that for and , for
For the initial value , there must exist and so that and for
We will show that, for any sufficiently small constant , If this is not true, there must exist some and so that By (3.23) and (3.20), we have , which is a contradiction. Thus (3.24) holds. Letting , we have that is, This implies that there exists so that From (3.22), there exists some constant so that which implies that the solution of system (3.15) is globally exponentially stable. The proof of Theorem 3.5 is complete.
Proof. Let . The result can be obtained directly by repeating the argument of Theorem 3.5, so the details are omitted.
Remark 3.7. is called the exponential convergence rate, which can be estimated by (3.20). Global exponential stability means that system (2.1) converges to the periodic solution associated with the external inputs whenever those inputs are provided to the system, irrespective of the initial values. The conditions of the above theorems and Corollary 3.6 depend on the spectral radius, which requires for and . They may be invalid when and is not identically zero. Therefore, we present other criteria to overcome this restriction.
Now we consider the following system: (A6) The function is continuous, and there exist continuous positive -periodic functions so that
Theorem 3.8. Assume that for all , that (A1), (A3), (A4), and (A6) hold, and that (A7) , where . Then system (3.32) has exactly one -periodic solution.
Proof. By (A7), is an M-matrix. Therefore, there exists a diagonal matrix so that is strictly diagonally dominant with positive diagonal entries, that is, Associated with (A1), (A3), (A4), (A6), and Lemma 2.8, we have If , we have
Let denote the set of continuous -periodic functions on . With the norm , is a Banach space. Define the operator as follows: where for and Clearly, is an -periodic solution of system (3.32) if and only if is a fixed point of the operator . For all , we have In view of (3.35), it is easily concluded that . This implies that is a contraction operator. By the well-known contraction mapping principle, there exists a unique fixed point of , which is the unique -periodic solution of system (3.32). The proof of Theorem 3.8 is complete.
Obviously, Cohen-Grossberg-type neural networks, cellular neural networks, and so on are special cases of system (2.1). It is not difficult to see that the conditions in our results generalize and improve the corresponding theorems in [21, 24].
Corollary 3.9. Assume that (A3) and (A4) hold and, moreover, that (A8) and (A9) hold, where and ; set , and . Then (i) the set is a positively invariant set of system (3.41); (ii) the set is a globally attracting set of system (3.41); (iii) there exists a global attractor of system (3.41); moreover, it is -periodic and belongs to the positively invariant set .
Corollary 3.10. Under the assumptions of Corollary 3.9, the -periodic solution of system (3.41) is globally exponentially stable if the following condition holds: The proof follows easily from Theorem 3.5 and is therefore omitted.
4. Two Examples
In this section, we give two examples to demonstrate the effectiveness of our criteria.
Example 4.1. In the following, we consider the periodic solutions of a two-neuron fuzzy network with delay: where , and are continuous periodic functions with period . Let By a simple computation, one can easily obtain where denotes the eigenvalues of the matrix . Therefore, . By Corollary 3.9, we know that the set is a positively invariant set of system (3.41) and a globally attracting set of system (3.41); that is, there exists a global attractor of system (3.41). Moreover, it is -periodic and belongs to the positively invariant set . In addition, one can easily obtain According to Corollary 3.10, the -periodic solution of system (3.41) is globally exponentially stable.
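The verification in Example 4.1 reduces to checking that the spectral radius of a nonnegative parameter matrix is less than one. Since the example's matrices are stripped from this copy, the 2×2 matrix below is purely hypothetical; the spectral radius is estimated by power iteration, which converges for nonnegative matrices with a dominant eigenvalue:

```python
# Estimate the spectral radius rho(W) of a nonnegative matrix by power
# iteration, as needed to verify the spectral-radius condition in the
# examples.  W below is a hypothetical 2x2 matrix (eigenvalues 0.55 and
# 0.1), not taken from the paper.

def power_iteration_rho(A, iters=500):
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)   # max-norm of A v
        v = [x / lam for x in w]       # renormalize
    return lam

W = [[0.30, 0.20],
     [0.25, 0.35]]  # hypothetical nonnegative coefficient matrix

rho = power_iteration_rho(W)
print(rho)  # approximately 0.55, which is < 1
```

Once ρ(W) < 1 is confirmed, the invariant-set and exponential-stability conclusions of the corollaries apply to the corresponding network.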
Example 4.2. We consider the periodic solutions of the following fuzzy Cohen-Grossberg neural networks with delay: where , and are continuous periodic functions with period . Let By a simple computation, we can obtain where denotes the eigenvalues of the matrix . Therefore, . By Theorem 3.1, we know that the set is a positively invariant set of system (4.5). According to Theorems 3.2 and 3.3, one can derive that the set is a globally attracting set of system (4.5); that is, there exists a global attractor of system (4.5). Moreover, it is -periodic and belongs to the positively invariant set . In addition, one can easily obtain According to Theorem 3.5, the -periodic solution of system (4.5) is globally exponentially stable.
5. Conclusions
Periodic oscillation is an important dynamical behavior in the applications and theory of neural networks. We have derived novel criteria for fuzzy Cohen-Grossberg neural networks with variable coefficients and distributed delays under easily checkable conditions and a minimal assumption on the feedback kernel. Our results extend some previous works, and the method is general and very concise. Meanwhile, we believe that the method can be applied, under appropriate assumptions, to other problems such as almost periodic solutions, synchronization of neural networks, and so on.
Acknowledgments
This work was jointly supported by the National Natural Science Foundation of China under Grant no. 60574043, the Natural Science Foundation of Jiangsu Province of China under Grant no. BK2006093, the National Science Foundation of Hunan Provincial Education Department under Grant no. 06C792 and no. 07C700, and the Foundation of Professor Project of Xiangnan University.
References
- M. A. Cohen and S. Grossberg, “Absolute stability of global pattern formation and parallel memory storage by competitive neural networks,” IEEE Transactions on Systems, Man and Cybernetics, vol. 13, no. 5, pp. 815–826, 1983.
- J. Cao and X. Li, “Stability in delayed Cohen-Grossberg neural networks: LMI optimization approach,” Physica D: Nonlinear Phenomena, vol. 212, no. 1-2, pp. 54–65, 2005.
- W. Xiong and J. Cao, “Exponential stability of discrete-time Cohen-Grossberg neural networks,” Neurocomputing, vol. 64, pp. 433–446, 2005.
- J. Cao and J. Liang, “Boundedness and stability for Cohen-Grossberg neural network with time-varying delays,” Journal of Mathematical Analysis and Applications, vol. 296, no. 2, pp. 665–685, 2004.
- Q. Song and J. Cao, “Stability analysis of Cohen-Grossberg neural network with both time-varying and continuously distributed delays,” Journal of Computational and Applied Mathematics, vol. 197, no. 1, pp. 188–203, 2006.
- W. Xiong and J. Cao, “Absolutely exponential stability of Cohen-Grossberg neural networks with unbounded delays,” Neurocomputing, vol. 68, pp. 1–12, 2005.
- K. Yuan and J. Cao, “An analysis of global asymptotic stability of delayed Cohen-Grossberg neural networks via nonsmooth analysis,” IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 52, no. 9, pp. 1854–1861, 2005.
- T. Chen and L. Rong, “Delay-independent stability analysis of Cohen-Grossberg neural networks,” Physics Letters A, vol. 317, no. 5-6, pp. 436–449, 2003.
- L. Wang and X. F. Zou, “Exponential stability of Cohen-Grossberg neural networks,” Neural Networks, vol. 15, no. 3, pp. 415–422, 2002.
- L. Wang and X. Zou, “Harmless delays in Cohen-Grossberg neural networks,” Physica D: Nonlinear Phenomena, vol. 170, no. 2, pp. 162–173, 2002.
- H. Ye, A. N. Michel, and K. Wang, “Qualitative analysis of Cohen-Grossberg neural networks with multiple delays,” Physical Review E, vol. 51, no. 3, pp. 2611–2618, 1995.
- J. Cao and Q. Song, “Stability in Cohen-Grossberg-type bidirectional associative memory neural networks with time-varying delays,” Nonlinearity, vol. 19, no. 7, pp. 1601–1617, 2006.
- X. F. Liao, C. G. Li, and K. W. Wong, “Criteria for exponential stability of Cohen-Grossberg neural networks,” Neural Networks, vol. 17, no. 10, pp. 1401–1414, 2004.
- Q. Song and J. Cao, “Impulsive effects on stability of fuzzy Cohen-Grossberg neural networks with time-varying delays,” IEEE Transactions on Systems, Man, and Cybernetics, Part B, vol. 37, no. 3, pp. 733–741, 2007.
- H. Jiang, J. Cao, and Z. Teng, “Dynamics of Cohen-Grossberg neural networks with time-varying delays,” Physics Letters A, vol. 354, no. 5-6, pp. 414–422, 2006.
- T. Yang, L. B. Yang, C. W. Wu, and L. O. Chua, “Fuzzy cellular neural networks: theory,” in Proceedings of the 4th IEEE International Workshop on Cellular Neural Networks and Applications (CNNA '96), vol. 6, pp. 181–186, Seville, Spain, June 1996.
- T. Yang, L. B. Yang, C. W. Wu, and L. O. Chua, “Fuzzy cellular neural networks: applications,” in Proceedings of the 4th IEEE International Workshop on Cellular Neural Networks and Applications, vol. 6, pp. 225–230, Seville, Spain, June 1996.
- T. Yang and L.-B. Yang, “The global stability of fuzzy cellular neural network,” IEEE Transactions on Circuits and Systems, vol. 43, no. 10, pp. 880–883, 1996.
- Y. Liu and W. Tang, “Exponential stability of fuzzy cellular neural networks with constant and time-varying delays,” Physics Letters A, vol. 323, no. 3-4, pp. 224–233, 2004.
- T. Huang, “Exponential stability of fuzzy cellular neural networks with distributed delay,” Physics Letters A, vol. 351, no. 1-2, pp. 48–52, 2006.
- K. Yuan, J. Cao, and J. Deng, “Exponential stability and periodic solutions of fuzzy cellular neural networks with time-varying delays,” Neurocomputing, vol. 69, no. 13–15, pp. 1619–1627, 2006.
- S. Wang and M. Wang, “A new detection algorithm based on fuzzy cellular neural networks for white blood cell detection,” IEEE Transactions on Information Technology in Biomedicine, vol. 10, no. 1, pp. 5–10, 2006.
- K. Gopalsamy and X. Z. He, “Stability in asymmetric Hopfield nets with transmission delays,” Physica D: Nonlinear Phenomena, vol. 76, no. 4, pp. 344–358, 1994.
- J. Cao and T. Chen, “Globally exponentially robust stability and periodicity of delayed neural networks,” Chaos, Solitons and Fractals, vol. 22, no. 4, pp. 957–963, 2004.
- S. Guo and L. Huang, “Periodic oscillation for a class of neural networks with variable coefficients,” Nonlinear Analysis: Real World Applications, vol. 6, no. 3, pp. 545–561, 2005.
- J. Hale and S. M. Verduyn Lunel, Introduction to Functional Differential Equations, vol. 99 of Applied Mathematical Sciences, Springer, New York, NY, USA, 1993.
- J. P. LaSalle, The Stability of Dynamical Systems, SIAM, Philadelphia, Pa, USA, 1976.
- R. A. Horn and C. R. Johnson, Matrix Analysis, Cambridge University Press, Cambridge, UK, 1990.
- A. Berman and R. J. Plemmons, Nonnegative Matrices in the Mathematical Sciences, Academic Press, New York, NY, USA, 1979.