Discrete Dynamics in Nature and Society
Volume 2013 (2013), Article ID 917835, 11 pages
http://dx.doi.org/10.1155/2013/917835
Research Article

Multistability and Multiperiodicity for a General Class of Delayed Cohen-Grossberg Neural Networks with Discontinuous Activation Functions

1Institute of Applied Mathematics, Shijiazhuang Mechanical Engineering College, Shijiazhuang 050003, China
2Training Department, Shijiazhuang Mechanical Engineering College, Shijiazhuang 050003, China

Received 21 March 2013; Revised 22 May 2013; Accepted 22 May 2013

Academic Editor: Seenith Sivasundaram

Copyright © 2013 Yanke Du et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

A general class of Cohen-Grossberg neural networks with time-varying delays, distributed delays, and discontinuous activation functions is investigated. By partitioning the state space and employing an analysis approach together with the Cauchy convergence principle, sufficient conditions are established for the existence and local exponential stability of multiple equilibrium points and periodic orbits, which ensure that -dimensional Cohen-Grossberg neural networks with -level discontinuous activation functions can have equilibrium points or periodic orbits. Finally, several examples are given to illustrate the feasibility of the obtained results.

1. Introduction

In recent years, great attention has been paid to neural networks due to their applications in many areas such as signal processing, associative memory, pattern recognition, parallel computation, and optimization. It should be pointed out that these successful applications rely heavily on the dynamic behaviors of neural networks. Stability, one of the most important properties of neural networks, is crucially required when designing them. As models of human brains, neural networks have a memory function; that is, the state at the present time is related to the state of the past, and time delay carries this historical information. In addition, neural networks have recently been implemented on electronic chips, and in such electronic implementations time delays are unavoidably encountered during the processing and transmission of signals. As in many other dynamical systems, it is well known that delays may result in oscillation and instability. Hence, to study delayed neural networks, one must address the problem of how to remove this destabilizing effect. A great number of results on the stability of delayed neural networks have been reported in the past two decades. Most of them focus on the uniqueness and global stability (or attractiveness) of the equilibrium and of the periodic (or almost-periodic) trajectory of delayed neural networks; readers can refer to [1–6] and many others. On the other hand, in applications to associative memory storage, pattern recognition, and decision making, the addressable memories, patterns, or decisions are stored as stable equilibria or stable periodic orbits. So it is necessary that there exist multiple stable equilibria or periodic orbits for neural networks, which is usually referred to as multistability or multiperiodicity, respectively. In the last few years, the multistability and multiperiodicity of neural networks have been reported in [7–26] and the references therein.
In particular, [14–21] investigated the multistability or multiperiodicity of neural networks with sigmoidal activation functions or nondecreasing saturated activation functions. In [22], sufficient conditions for multistability were presented for a class of neural networks with Mexican-hat-type activation functions. In [23], some multistability properties were studied for a class of bidirectional associative memory recurrent neural networks with unsaturating piecewise linear transfer functions. In order to increase storage capacity, [24, 25] investigated the multistability of two classes of neural networks with piecewise linear activation functions. The stability of multiple equilibria of neural networks with time-varying delays and concave-convex characteristics was formulated in [26]. Note that most of the results above were based on the assumption that the activation functions are continuous, whereas, as mentioned in [27], neural networks with discontinuous activation functions are important and frequently encountered in applications when dealing with dynamical systems possessing high-slope nonlinear elements. For this reason, much effort has been devoted to analyzing the dynamic behavior of neural networks with discontinuous activation functions (see [28–31]). In [29], the authors considered a recurrent neural network with a special class of discontinuous activation functions that are piecewise constant in the state space. In [30, 31], multistability issues were discussed for two classes of recurrent neural networks with -level discontinuous activation functions, and sufficient conditions were established to ensure the existence of locally exponentially stable equilibria.

We point out that the majority of the references mentioned above studied Hopfield neural networks. As one of the most important classes of neural networks, Cohen-Grossberg neural networks (CGNNs) include many famous ecological systems and neural networks as special cases, such as the Lotka-Volterra system, the Gilpin-Ayala competition system, the Eigen-Schuster system, Hopfield neural networks, and cellular neural networks. CGNNs have attracted intensive investigation in recent years. On the other hand, neural networks usually have a spatial extent due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths. It is therefore desirable to model them by introducing continuously distributed delays over a certain duration of time such that the distant past has less influence than the recent behavior of the state. However, to the best of our knowledge, the multistability and multiperiodicity issues have seldom been considered for CGNNs with distributed delays and discontinuous activation functions.

Motivated by the works of [30, 31] and the discussions above, the purpose of this paper is to explore the multistability and multiperiodicity of CGNNs with time-varying delays, distributed delays, and discontinuous activation functions. To this end, we consider the following differential equations: where denotes the state of the th neuron at time ; denotes the amplification function; ; denotes the activation function; denotes the time-varying delay associated with the th neuron, which is variable with time due to the finite switching speed of amplifiers and satisfies ; represents the delay kernel function, which is a real-valued continuous function; , , and denote the connection strengths; is the external input bias.
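A representative delayed CGNN of the type just described, written in commonly used notation, is the following (the symbols below are illustrative stand-ins, not necessarily those of system (1)):

```latex
\dot{x}_i(t) = -a_i\big(x_i(t)\big)\Big[\, b_i\big(x_i(t)\big)
  - \sum_{j=1}^{n} c_{ij}\, f_j\big(x_j(t)\big)
  - \sum_{j=1}^{n} d_{ij}\, f_j\big(x_j(t-\tau_{ij}(t))\big)
  - \sum_{j=1}^{n} e_{ij} \int_{-\infty}^{t} K_{ij}(t-s)\, f_j\big(x_j(s)\big)\, ds
  - I_i \Big], \qquad i = 1, \dots, n,
```

where $a_i$ denotes the amplification function, $f_j$ the activation function, $\tau_{ij}(t)$ the time-varying delay, $K_{ij}$ the delay kernel, $c_{ij}$, $d_{ij}$, and $e_{ij}$ the connection strengths, and $I_i$ the external input bias; the self-feedback term $b_i$ is an assumption of this sketch.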

The characteristics of activation functions have a major impact on the existence and stability of stationary patterns for neural networks. In this paper, we consider the following -level discontinuous activation functions: where ; is an integer satisfying ; , are constants; and .
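As a concrete instance (with illustrative thresholds and output levels, not necessarily those of (2)), a 3-level discontinuous activation of this kind can be written as

```latex
f(x) =
\begin{cases}
-1, & x < -1,\\[2pt]
\phantom{-}0, & -1 \le x < 1,\\[2pt]
\phantom{-}1, & x \ge 1,
\end{cases}
```

a piecewise-constant step function whose jump points split the coordinate axis into three intervals. An $r$-level function of this form splits each axis into $r$ intervals, so an $n$-neuron network has $r^{n}$ candidate regions, consistent with the counts nine $(3^2)$, eight $(2^3)$, and sixteen $(4^2)$ in the examples of Section 5.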

The organization of this paper is as follows. In Section 2, some preliminaries are given. In Section 3, through decomposing state space into multiple positively invariant sets, we derive sufficient conditions for the existence and exponential stability of equilibria in any designated region. As an extension, similar results are presented for delayed CGNNs with periodic time-varying delays and external inputs in Section 4. In Section 5, four numerical examples are provided to illustrate the feasibility of the obtained results. A concluding remark is given in Section 6 to end this work.

2. Preliminaries

Denote . Let be the space of all continuous functions mapping into with the norm defined by , where . Denote as the vector norm of . For any function , we define , , and . We denote by the solution of system (1) with initial condition , . Throughout this paper, we make the following assumptions.  (H1) The amplification function is continuous, and there exist constants and such that  (H2) Each kernel function is continuous and satisfies Moreover, when , .

Since is allowed to have points of discontinuity, we consider the solution of system (1) in the sense of Filippov [32].

Definition 1 (Forti et al. [33]). A function , is a solution of system (1) on if (1) is continuous on and absolutely continuous on ; (2) there exists a measurable function , such that for almost all (a.a.) , and for a.a. , where represents the closure of the convex hull of .

Definition 2 (Forti et al. [33]). For any continuous function , and any measurable selection , such that for a.a. , by an initial value problem associated to system (1) with initial condition , , we mean the following problem: find a couple of functions , such that is a solution of system (1) on for some , is an output associated to , and where represents the closure of the convex hull of .

Definition 3. Let be a subset of . is said to be a positively invariant set of system (1) if and only if for any initial condition , we have for all .

Definition 4. An equilibrium of system (1) is said to be locally exponentially stable in , if there exist constants , and such that where initial condition . When , is said to be globally exponentially stable.

Definition 5. A periodic orbit of system (1) is said to be locally exponentially stable in , if there exist constants , and such that where initial condition and , . When , is said to be globally exponentially stable.
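In both definitions the required estimate is the standard exponential bound; for an equilibrium $x^{*}$ it typically reads (with $M \ge 1$ and $\varepsilon > 0$ the constants of the definition)

```latex
\|x(t;\phi) - x^{*}\| \le M\, \|\phi - x^{*}\|\, e^{-\varepsilon t}, \qquad t \ge 0,
```

and for a periodic orbit $x^{*}(t)$ the same bound holds with $x^{*}$ replaced by $x^{*}(t)$ and the initial distance measured in the supremum norm over the delay interval.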

Denote where equals or    and . It is easy to see that is composed of parts.

3. Multistability for the CGNNs

In this section, we consider the existence and exponential stability of equilibrium points for system (1) in each division region.

Theorem 6. Under assumptions (H1) and (H2), for each division region , is a positively invariant set of system (1), if the following conditions are satisfied: where , , and .

Proof. From (10), it is readily seen that there exists a small constant such that for all , Denote where equals or and . Obviously, there exists a bijection between and , and is the limit set of as . Consequently, we only need to prove that for all , it is a positively invariant set of system (1).
For each , we consider any solution of system (1) with initial condition . We claim that the solution for all . If this is not true, we need to discuss the following three cases.
Case I. If and there exists a such that , and for , we get from (1) and (11) that which yields a contradiction to . Therefore, for all .
Case II. If    and there exists a such that either ,   and for or ,   and for . For the first subcase, we have which leads to a contradiction with . Similarly, one can prove the second subcase. Therefore, for all .
Case III. If and there exists a such that , and for , we obtain that which is a contradiction. Therefore, for all .
From Case I to Case III, we know that for all ; that is, is a positively invariant set of system (1). The proof is complete.

Theorem 7. Assume that assumptions (H1) and (H2) and the inequalities in (10) hold. For each division region , system (1) has a unique equilibrium located in which is locally exponentially stable.

Proof. Let and be any two solutions of system (1) with , , respectively. By the positive invariance of , we get that , for all . For , define Calculating the derivative of along the trajectories of system (1), we obtain that Since we have It follows from (17) and (19) that Hence, which yields that Inequality (22) shows that exponentially converges to as . In particular, for any fixed , denote . Then is also a solution of system (1). Thus, as for any , which indicates that is a Cauchy sequence when is large enough. By Cauchy convergence principle, approaches a constant vector which is a solution of system (1). From (22) and the invariance of , we know that is the unique equilibrium of system (1) which is locally exponentially stable in . The proof is complete.
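The closing Cauchy argument can be summarized as follows (notation illustrative). For fixed $h \in [0,1]$, the translate $x(\cdot + h)$ is again a solution remaining in the same invariant region, so the exponential estimate (22) yields

```latex
\|x(t+h) - x(t)\| \;\le\; M \sup_{h \in [0,1]} \|x_h - x_0\|\; e^{-\varepsilon t} \;=:\; C\, e^{-\varepsilon t}.
```

In particular $\|x(m+1) - x(m)\| \le C e^{-\varepsilon m}$, and summing this geometric tail shows that $\{x(m)\}$ is a Cauchy sequence; its limit $x^{\ast}$ is a constant solution of (1), that is, the unique equilibrium in the region.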

If , system (1) reduces to the following Hopfield neural networks:

Corollary 8. Assume that assumption (H2) and the inequalities in (10) hold. For each division region , system (23) has a unique equilibrium located in which is locally exponentially stable.

Remark 9. It is noted that the number of such regions is in . Hence, under the conditions of Theorem 7, there exist locally exponentially stable equilibrium points for system (1) with activation functions (2). In practical applications, neural networks with this class of activation functions can store many more patterns than those with sigmoidal activation functions or nondecreasing activation functions with saturation (see, e.g., [14–21]).

Remark 10. An important application of multistability of recurrent neural networks is to implement pattern memory (see [34, 35]). A recalling probe, which is sufficiently close to the pattern to be retrieved, is set as an initial state and the state variables converge to the locally stable equilibrium point, which corresponds to the pattern to be retrieved. Hence, our results are useful for associative memories since they provide new criteria to guarantee the coexistence of encoded patterns and their local attractivity.

Remark 11. In [30], the authors investigated the multistability of a class of Hopfield neural networks with time-varying delays and discontinuous activation functions. By applying the principle of compressed mapping, sufficient conditions were established for the existence of multiple equilibrium points. In contrast, our multistability results are derived by means of an analysis approach and the Cauchy convergence principle. When , , and , system (1) reduces to the system in [30]. Denote , in (2). Then the inequalities in (10) are equivalent to where and . We see that the main result, Theorem 2 in [30], is a special case of Theorem 7 in our paper. Therefore, the results obtained in this paper improve and generalize those in [30].

Remark 12. We relax the restriction on the discontinuous activation functions adopted in [30], which are required to be nondecreasing. In our paper, the activation functions (2) may be either nondecreasing or nonincreasing. More precisely, we can deduce from (10) that

Theorem 13. Under assumptions (H1) and (H2), for each division region , system (1) has a unique equilibrium located in which is globally exponentially stable, if the following conditions are satisfied: where , , and .

Proof. Let be any solution of (1) with initial condition . If and , we obtain from (1) and (26) that If    and or , we have or If and , we get Therefore, will go into and stay in ; that is, there exists a such that for all . For any other solution of system (1) with initial condition , there exists a such that for all . Similar to the proof of Theorem 7, we can show that and region is a positively invariant set. Hence, system (1) has a unique equilibrium located in which is globally exponentially stable. The proof is complete.

4. Multiperiodicity for the CGNNs

Consider the following CGNNs in which the time-varying delays and external inputs are periodic: where and for all .

Theorem 14. Under assumptions (H1) and (H2), for each division region , is a positively invariant set of system (32), and system (32) has a unique periodic orbit located in which is locally exponentially stable, if the following conditions are satisfied: where , , and .

Proof. Similar to the proof of Theorem 6, we can prove that each division region is a positively invariant set of system (32). Hence, for any initial condition , we have for any . Define a Poincaré map by . Then and . We can choose a positive integer such that for all . From (22), we have , where . This inequality implies that there exists a unique fixed point such that . Note that . This means that is also a fixed point of ; then , that is, . Obviously, if is a solution of system (32), then is also a solution of system (32), and for all . Therefore, for all . This shows that is a periodic orbit of (32) located in . The locally exponential stability can be proved similarly to Theorem 7. This completes the proof.
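The contraction step in this proof is the standard one (notation illustrative): if the $m$-fold Poincaré map satisfies

```latex
\|P^{m}\phi - P^{m}\psi\| \le \alpha\, \|\phi - \psi\|, \qquad 0 < \alpha < 1,
```

then $P^{m}$ has a unique fixed point $\phi^{\ast}$; since $P^{m}(P\phi^{\ast}) = P(P^{m}\phi^{\ast}) = P\phi^{\ast}$, uniqueness forces $P\phi^{\ast} = \phi^{\ast}$, and the solution issuing from $\phi^{\ast}$ returns to itself after one period, that is, it is an $\omega$-periodic orbit.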

5. Illustrative Examples

In this section, four examples are presented to illustrate our main results derived in Sections 3 and 4.

Example 1. Consider a 2-neuron delayed CGNN with 3-level discontinuous activation function; that is, in system (1), we choose

For simplicity, we choose the activation functions as

It is not difficult to verify that system (1) with (34) and (35) satisfies assumptions (H1) and (H2) and the inequalities in (10). According to Theorems 6 and 7, system (1) has nine locally exponentially stable equilibrium points:

The dynamics of this system are depicted in Figure 1(a), where the evolutions of 120 initial conditions are tracked.

Figure 1: Transient behaviors of the state variables and with different initial values in Example 1.

In system (32), let , , , and , and let the other parameters and functions be the same as those in (34) and (35). We can verify that system (32) satisfies assumptions (H1) and (H2) and the inequalities in (33). According to Theorem 14, system (32) has nine locally exponentially stable periodic orbits. Figure 1(b) shows the phase view of state variable of system (32).
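The qualitative behavior in Example 1, namely several coexisting stable equilibria under a 3-level step activation, can be reproduced with a small simulation. The sketch below is a simplification: delays and the amplification function are omitted, and the weights, bias, and activation thresholds are illustrative choices, not the parameters of (34) and (35).

```python
import numpy as np

def act(x):
    """A 3-level step activation (illustrative thresholds, not those of (35)):
    -1 for x < -1, 0 for -1 <= x < 1, 1 for x >= 1."""
    return np.where(x >= 1.0, 1.0, np.where(x < -1.0, -1.0, 0.0))

def simulate(x0, W, b, dt=0.01, steps=4000):
    """Forward-Euler integration of dx/dt = -x + W f(x) + b.
    Delays and the amplification function of system (1) are dropped
    to keep the sketch minimal."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-x + W @ act(x) + b)
    return x

# Illustrative parameters: strong self-excitation, zero input bias.
W = np.array([[2.0, 0.0], [0.0, 2.0]])
b = np.zeros(2)

# Probes started in different regions converge to distinct equilibria.
for x0 in [(1.5, 1.5), (0.2, -0.3), (1.5, -1.5)]:
    print(x0, "->", simulate(x0, W, b).round(3))
```

Each probe converges to the equilibrium of its own region, mirroring the nine-equilibrium picture of Figure 1(a); here only three of the nine regions are sampled.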

Example 2. Consider a 3-neuron delayed CGNN with 2-level discontinuous activation function. In system (1), we choose

We can verify that system (1) with (37) satisfies assumptions (H1) and (H2) and the inequalities in (10). According to Theorems 6 and 7, system (1) has eight locally exponentially stable equilibrium points: The dynamics of this system are depicted in Figure 2(a).

Figure 2: Transient behaviors of the state variables , , and with different initial values in Example 2.

In system (32), let , , , , and , and let the other parameters and functions be the same as those in (37). Then system (32) satisfies assumptions (H1) and (H2) and the inequalities in (33). According to Theorem 14, system (32) has eight locally exponentially stable periodic orbits (see Figure 2(b)).

Example 3. Consider a 2-neuron delayed CGNN with 4-level discontinuous activation function. In system (1), we choose

The activation functions are described by

We can verify that system (1) with (39) and (40) satisfies assumptions (H1) and (H2) and the inequalities in (10). According to Theorems 6 and 7, system (1) has sixteen locally exponentially stable equilibrium points (see Figure 3(a)).

Figure 3: Transient behaviors of the state variables and with different initial values in Example 3.

In system (32), let , , , and , and let the other parameters and functions be the same as those in (39) and (40). We can verify that system (32) satisfies assumptions (H1) and (H2) and the inequalities in (33). According to Theorem 14, system (32) has sixteen locally exponentially stable periodic orbits (see Figure 3(b)).

Example 4. For associative memories of neural networks, the desired memory patterns are usually represented by binary vectors. Design a CGNN to store eight patterns    shown in Figure 4 as stable memories (white = −1 and black = 1).

Figure 4: Eight desired memory patterns for Example 4.

We consider system (1) with (37) in Example 2. Transform

Then are given in (38). According to Example 2, the corresponding memory patterns are stored as stable equilibria of system (1) with (37), which can be retrieved when the initial patterns contain sufficient information, that is, when they fall within the corresponding domains of attraction.

6. Conclusions

In this paper, we performed multistability and multiperiodicity analyses for a general class of CGNNs with time-varying delays, distributed delays, and discontinuous activation functions. Sufficient conditions were established to ensure that there exist locally exponentially stable equilibrium points for the -dimensional CGNNs with -level discontinuous activation functions. We also derived sufficient conditions for the existence and global exponential stability of the equilibrium point of the system. As an extension of multistability, sufficient conditions were established to ensure that the system has locally exponentially stable periodic orbits when the time-varying delays and external inputs are periodic. Numerical simulations demonstrated the main results. The existing multistability results in [30] were improved and extended. Our results provide new criteria to guarantee the coexistence of encoded patterns and their local attractivity and are therefore useful for associative memories.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (11071254), the Natural Science Foundation of Hebei Province (A2013506012), and the Science Foundation of Mechanical Engineering College (YJJXM11004).

References

  1. J. Tian and S. Zhong, “Improved delay-dependent stability criterion for neural networks with time-varying delay,” Applied Mathematics and Computation, vol. 217, no. 24, pp. 10278–10288, 2011.
  2. H. Wang, Q. Song, and C. Duan, “LMI criteria on exponential stability of BAM neural networks with both time-varying delays and general activation functions,” Mathematics and Computers in Simulation, vol. 81, no. 4, pp. 837–850, 2010.
  3. X. Zhang, S. Wu, and K. Li, “Delay-dependent exponential stability for impulsive Cohen-Grossberg neural networks with time-varying delays and reaction-diffusion terms,” Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 3, pp. 1524–1532, 2011.
  4. X. Fu and X. Li, “LMI conditions for stability of impulsive stochastic Cohen-Grossberg neural networks with mixed delays,” Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 1, pp. 435–454, 2011.
  5. B. Zhou, Q. Song, and H. Wang, “Global exponential stability of neural networks with discrete and distributed delays and general activation functions on time scales,” Neurocomputing, vol. 74, no. 17, pp. 3142–3150, 2011.
  6. J. Pan and Y. Zhan, “On periodic solutions to a class of non-autonomously delayed reaction-diffusion neural networks,” Communications in Nonlinear Science and Numerical Simulation, vol. 16, no. 1, pp. 414–422, 2011.
  7. Z. Huang, S. Mohamod, and H. Bin, “Multiperiodicity analysis and numerical simulation of discrete-time transiently chaotic non-autonomous neural networks with time-varying delays,” Communications in Nonlinear Science and Numerical Simulation, vol. 15, no. 5, pp. 1348–1357, 2010.
  8. L. Wang, W. Lu, and T. Chen, “Multistability and new attraction basins of almost-periodic solutions of delayed neural networks,” IEEE Transactions on Neural Networks, vol. 20, no. 10, pp. 1581–1593, 2009.
  9. E. Kaslik and S. Sivasundaram, “Multiple periodic solutions in impulsive hybrid neural networks with delays,” Applied Mathematics and Computation, vol. 217, no. 10, pp. 4890–4899, 2011.
  10. E. Kaslik and S. Sivasundaram, “Impulsive hybrid discrete-time Hopfield neural networks with delays and multistability analysis,” Neural Networks, vol. 24, no. 4, pp. 370–377, 2011.
  11. E. Kaslik and S. Sivasundaram, “Multistability in impulsive hybrid Hopfield neural networks with distributed delays,” Nonlinear Analysis: Real World Applications, vol. 12, no. 3, pp. 1640–1649, 2011.
  12. T. Zhou, M. Wang, and M. Long, “Existence and exponential stability of multiple periodic solutions for a multidirectional associative memory neural network,” Neural Processing Letters, vol. 35, no. 2, pp. 187–202, 2012.
  13. Y. Huang and X. Zhang, “Multistability properties of linear threshold discrete-time recurrent neural networks,” International Journal of Information and Systems Sciences, vol. 7, no. 1, pp. 1–10, 2011.
  14. J. Cao, G. Feng, and Y. Wang, “Multistability and multiperiodicity of delayed Cohen-Grossberg neural networks with a general class of activation functions,” Physica D, vol. 237, no. 13, pp. 1734–1749, 2008.
  15. Z. Huang, C. Feng, and S. Mohamad, “Multistability analysis for a general class of delayed Cohen-Grossberg neural networks,” Information Sciences, vol. 187, no. 1, pp. 233–244, 2012.
  16. X. Nie and Z. Huang, “Multistability and multiperiodicity of high-order competitive neural networks with a general class of activation functions,” Neurocomputing, vol. 82, no. 1, pp. 1–13, 2012.
  17. G. Huang and J. Cao, “Multistability in bidirectional associative memory neural networks,” Physics Letters A, vol. 372, no. 16, pp. 2842–2854, 2008.
  18. G. Huang and J. Cao, “Delay-dependent multistability in recurrent neural networks,” Neural Networks, vol. 23, no. 2, pp. 201–209, 2010.
  19. X. Nie and J. Cao, “Multistability of second-order competitive neural networks with nondecreasing saturated activation functions,” IEEE Transactions on Neural Networks, vol. 22, no. 11, pp. 1694–1708, 2011.
  20. W. Ding and L. Wang, “2N almost periodic attractors for Cohen-Grossberg-type BAM neural networks with variable coefficients and distributed delays,” Journal of Mathematical Analysis and Applications, vol. 373, no. 1, pp. 322–342, 2011.
  21. C. Cheng, K. Lin, and C. Shih, “Multistability and convergence in delayed neural networks,” Physica D, vol. 225, no. 1, pp. 61–74, 2007.
  22. L. Wang and T. Chen, “Multistability of neural networks with Mexican-hat-type activation functions,” IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 11, pp. 1816–1826, 2012.
  23. L. Zhang, Z. Yi, J. Yu, and P. A. Heng, “Some multistability properties of bidirectional associative memory recurrent neural networks with unsaturating piecewise linear transfer functions,” Neurocomputing, vol. 72, no. 16–18, pp. 3809–3817, 2009.
  24. Z. Zeng, T. Huang, and W. X. Zheng, “Multistability of recurrent neural networks with time-varying delays and the piecewise linear activation function,” IEEE Transactions on Neural Networks, vol. 21, no. 8, pp. 1371–1377, 2010.
  25. L. Wang, W. Lu, and T. Chen, “Coexistence and local stability of multiple equilibria in neural networks with piecewise linear nondecreasing activation functions,” Neural Networks, vol. 23, no. 2, pp. 189–200, 2010.
  26. Z. Zeng and W. Zheng, “Multistability of neural networks with time-varying delays and concave-convex characteristics,” IEEE Transactions on Neural Networks and Learning Systems, vol. 23, no. 2, pp. 293–305, 2012.
  27. M. Forti and P. Nistri, “Global convergence of neural networks with discontinuous neuron activations,” IEEE Transactions on Circuits and Systems I, vol. 50, no. 11, pp. 1421–1435, 2003.
  28. G. Huang and J. Cao, “Multistability of neural networks with discontinuous activation function,” Communications in Nonlinear Science and Numerical Simulation, vol. 13, no. 10, pp. 2279–2289, 2008.
  29. G. Bao and Z. Zeng, “Analysis and design of associative memories based on recurrent neural network with discontinuous activation functions,” Neurocomputing, vol. 77, no. 1, pp. 101–107, 2012.
  30. Y. Huang, H. Zhang, and Z. Wang, “Dynamical stability analysis of multiple equilibrium points in time-varying delayed recurrent neural networks with discontinuous activation functions,” Neurocomputing, vol. 91, no. 15, pp. 21–28, 2012.
  31. Y. Huang, H. Zhang, and Z. Wang, “Multistability and multiperiodicity of delayed bidirectional associative memory neural networks with discontinuous activation functions,” Applied Mathematics and Computation, vol. 219, no. 3, pp. 899–910, 2012.
  32. A. Filippov, “Differential equations with discontinuous righthand sides,” in Mathematics and Its Applications, Soviet Series, Kluwer Academic Publishers, Boston, Mass, USA, 1988.
  33. M. Forti, P. Nistri, and D. Papini, “Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain,” IEEE Transactions on Neural Networks, vol. 16, no. 6, pp. 1449–1463, 2005.
  34. Z. Zeng and J. Wang, “Associative memories based on continuous-time cellular neural networks designed using space-invariant cloning templates,” Neural Networks, vol. 22, no. 5-6, pp. 651–657, 2009.
  35. Z. Zeng and J. Wang, “Analysis and design of associative memories based on recurrent neural networks with linear saturation activation functions and time-varying delays,” Neural Computation, vol. 19, no. 8, pp. 2149–2182, 2007.