Mathematical Problems in Engineering
Volume 2013 (2013), Article ID 236189, 12 pages
http://dx.doi.org/10.1155/2013/236189
Research Article

Stability of Almost Periodic Solution for a General Class of Discontinuous Neural Networks with Mixed Time-Varying Delays

College of Information Science and Engineering, Yanshan University, Qinhuangdao 066004, China

Received 2 September 2013; Accepted 5 November 2013

Academic Editor: Huaiqin Wu

Copyright © 2013 Yingwei Li. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The global exponential stability issues are considered for almost periodic solution of the neural networks with mixed time-varying delays and discontinuous neuron activations. Some sufficient conditions for the existence, uniqueness, and global exponential stability of almost periodic solution are achieved in terms of certain linear matrix inequalities (LMIs), by applying differential inclusions theory, matrix inequality analysis technique, and generalized Lyapunov functional approach. In addition, the existence and asymptotically almost periodic behavior of the solution of the neural networks are also investigated under the framework of the solution in the sense of Filippov. Two simulation examples are given to illustrate the validity of the theoretical results.

1. Introduction

When neural networks are applied to solve practical problems in optimization, control, and signal processing, they are usually designed to be globally asymptotically or exponentially stable in order to avoid spurious responses or the problem of local minima. Hence exploring the global stability of neural networks is of primary importance. In recent years, neural networks with discontinuous activations, as a special class of dynamical systems described by differential equations with discontinuous right-hand sides, have proved useful for a number of interesting engineering tasks, such as dry friction, impacting machines, switching in electronic circuits, systems oscillating under the effect of an earthquake, and control synthesis of uncertain systems, and have therefore received extensive attention; see, for example, [1–29] and the references therein. In [1], Forti and Nistri first dealt with the global asymptotic stability (GAS) and global convergence in finite time of a unique equilibrium point for neural networks modeled by differential equations with discontinuous right-hand sides; by using a Lyapunov diagonally stable (LDS) matrix and constructing a suitable Lyapunov function, several stability conditions were derived. In the subsequent literature, considerable effort has been devoted to investigating the stability of neural network systems with discontinuous activation functions. In [2, 3], by applying a generalized Lyapunov approach and M-matrix theory, Forti et al. discussed the global exponential stability (GES) of neural networks with discontinuous or non-Lipschitz activation functions. In [4], arguing as in [1], Lu and Chen dealt with the GES and GAS of Cohen-Grossberg neural networks with discontinuous activation functions.
In [5–9], under the framework of Filippov solutions, by using differential inclusions and a Lyapunov approach, a series of results was obtained for the global stability of the unique equilibrium point of neural networks with a single constant time delay and discontinuous activations. In [10], by using differential inclusions and constructing a Lyapunov functional, Liu et al. discussed the global stability of the unique equilibrium point for neural networks with mixed time-varying delays and discontinuous neuron activations, and some stability results were given in terms of certain LMIs. In [11], Nie and Cao studied the existence and global stability of the equilibrium point for time-varying delayed competitive neural networks with discontinuous activation functions. In [12, 13], based on Lyapunov stability theory and matrix inequality analysis techniques, the authors analyzed the global robust stability of a unique equilibrium point for neural networks with time-varying delays and discontinuous neuron activation functions. In [14], Liu and Cao discussed the robust state estimation problem for time-varying delayed neural networks with discontinuous activation functions via differential inclusions, and some criteria were established to guarantee the existence of a robust state estimator. In [15], under the framework of Filippov solutions, by using a matrix measure approach, Liu et al. investigated the global dissipativity and quasisynchronization issues for time-varying delayed neural networks with discontinuous activations and parameter mismatches.

It is well known that an equilibrium point can be regarded as a special case of a periodic solution with arbitrary period or zero amplitude. Through the study of periodic solutions, more general results can be obtained than through the study of equilibrium points alone. Hence, beyond the global stability of the equilibrium point of neural networks with discontinuous activation functions, much attention has been paid to the stability of periodic solutions for various neural network systems with discontinuous activations; see, for example, [17–28]. In [18–21], under the Filippov inclusion framework, by using the Leray-Schauder alternative theorem and a Lyapunov approach, the authors presented conditions for the existence and GES or GAS of a unique periodic solution for Hopfield or BAM neural networks with discontinuous activation functions. In [22, 23], Wu et al. discussed the existence and GES of the unique periodic solution for neural networks with discontinuous activation functions under impulsive control. In [25, 26], under the framework of Filippov solutions, by using a Lyapunov approach and M-matrix theory, the authors presented stability results for the periodic solution of delayed Cohen-Grossberg neural networks with a single constant time delay and discontinuous activation functions. In [27, 28], the authors explored the periodic dynamical behavior of neural networks with time-varying delays and discontinuous activation functions, and some conditions were proposed to ensure the existence and GES of the unique periodic solution.

It should be noted that all the results reported in the literature above concern the stability analysis of equilibrium points or periodic solutions and neglect the effects of almost periodicity for neural networks with discontinuous activation functions. Almost periodicity is one of the basic properties of dynamical neural systems: an almost periodic trajectory approximately, but not exactly, retraces its path through phase space. Almost periodic functions, with a superior spatial structure, can be regarded as a generalization of periodic functions. As pointed out in [29, 30], in practice the almost periodic phenomenon is more common than the periodic one, and almost periodic oscillatory behavior accords better with reality. In the past few years, a number of results have been obtained for the global stability of the almost periodic solution of various neural networks with continuous activation functions; see, for instance, [31–39] and the references therein. In [31], Ding and Wang presented results on the existence and stability of almost periodic attractors for Cohen-Grossberg-type BAM neural networks with variable coefficients. In [37], Wang et al. investigated the multistability of delayed neural networks and described new attraction basins of almost periodic solutions. Very recently, under the framework of the theory of Filippov differential inclusions, Allegretto et al. [29] proved the common asymptotic behavior of almost periodic solutions for discontinuous, delayed, and impulsive neural networks. In [30, 40], Lu and Chen and Qin et al. discussed the existence and uniqueness of an almost periodic solution (as well as its global exponential stability) of delayed neural networks with almost periodic coefficients and discontinuous activations.
It should be pointed out that the network model explored in [29, 30, 40] is a class of discontinuous neural networks with a single constant time delay, and the stability conditions there were obtained by using LDS matrix or M-matrix conditions. Compared with stability conditions expressed in terms of LMIs, the results obtained in [29, 30, 40] are rather conservative. To the authors' knowledge, very few works have dealt with the stability analysis of almost periodic solutions of discontinuous neural networks with time-varying delays, particularly the global stability of almost periodic solutions via LMI analysis techniques, which motivates the work of this paper.

In this paper, our aim is to study the delay-dependent exponential stability problem for almost periodic solution of the neural networks with mixed time-varying delays and discontinuous activation functions. Under the framework of Filippov differential inclusions, by applying the nonsmooth Lyapunov stability theory and highly efficient LMI approach, new delay-dependent sufficient conditions are presented to ensure the existence and GES of almost periodic solution in terms of LMIs, which can be solved efficiently by using recently developed convex optimization algorithms [41]. Moreover, the obtained conclusion is applied to prove the existence and stability of periodic solution (or equilibrium point) for the neural networks with mixed time-varying delays and discontinuous activations.
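As a minimal numerical stand-in for the kind of Lyapunov/LMI feasibility check that these conditions reduce to, one can solve a Lyapunov equation and verify positive definiteness of the solution. The matrix A below is hypothetical, not a system matrix from this paper, and the sketch does not reproduce the paper's delay-dependent LMI; it only illustrates the basic mechanism.

```python
# Sketch of a Lyapunov-type stability check: for a Hurwitz matrix A,
# solve A^T P + P A = -Q with Q > 0 and verify that P is positive
# definite.  A is a hypothetical example, not taken from the paper.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])   # hypothetical stable matrix
Q = np.eye(2)                 # any positive definite right-hand side

# solve_continuous_lyapunov(a, q) solves a X + X a^T = q; taking a = A^T
# yields A^T P + P A = -Q.
P = solve_continuous_lyapunov(A.T, -Q)

eigs = np.linalg.eigvalsh((P + P.T) / 2)
print("P positive definite:", bool(np.all(eigs > 0)))
```

Full LMI conditions with matrix variables, such as those derived in Section 3, are handled in practice by semidefinite programming solvers (the examples in Section 4 use MATLAB's LMI toolbox).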

For convenience, some notations are introduced as follows. R denotes the set of real numbers, R^n denotes the n-dimensional Euclidean space, and R^{n×n} denotes the set of all n × n real matrices. For any matrix A, A > 0 (A < 0) means that A is positive definite (negative definite). A^{-1} denotes the inverse of A. A^T denotes the transpose of A. λ_max(A) and λ_min(A) denote the maximum and minimum eigenvalues of A, respectively. Given vectors x = (x_1, ..., x_n)^T and y = (y_1, ..., y_n)^T in R^n, ⟨x, y⟩ = Σ_{i=1}^n x_i y_i and |x| = (Σ_{i=1}^n x_i^2)^{1/2}. ||A|| denotes the 2-norm of A; that is, ||A|| = (ρ(A^T A))^{1/2}, where ρ(A^T A) denotes the spectral radius of A^T A. For τ > 0, C([−τ, 0]; R^n) denotes the family of continuous functions φ from [−τ, 0] to R^n with the norm ||φ|| = sup_{−τ ≤ s ≤ 0} |φ(s)|. ẋ(t) denotes the derivative of x(t).

Given a set E ⊆ R^n, co[E] denotes the closure of the convex hull of E, and P_kc(R^n) denotes the collection of all nonempty, closed, and convex subsets of R^n.

Let f: R^n → R be a locally Lipschitz continuous function. Clarke's generalized gradient [42] of f at x is defined by

∂f(x) = co{ lim_{k→∞} ∇f(x_k) : x_k → x, x_k ∉ Ω_f, x_k ∉ N },

where Ω_f is the set of Lebesgue measure zero on which the gradient ∇f does not exist and N ⊂ R^n is an arbitrary set with measure zero.
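As a concrete example, for f(x) = |x| the generalized gradient is the ordinary derivative sign(x) away from 0 and the interval [−1, 1] at 0 (the convex hull of the limiting slopes ±1). The helper below is a hypothetical illustration, not a routine from the paper:

```python
# Clarke's generalized gradient for f(x) = |x|, a locally Lipschitz
# function that is not differentiable at 0.  Away from 0 the gradient is
# the ordinary derivative sign(x); at 0 it is the convex hull of the
# limiting slopes -1 and +1, i.e. the interval [-1, 1].
def clarke_gradient_abs(x, tol=1e-12):
    """Clarke gradient of |x|, returned as an interval (lo, hi)."""
    if abs(x) > tol:
        s = 1.0 if x > 0 else -1.0
        return (s, s)            # singleton: f is differentiable here
    return (-1.0, 1.0)           # convex hull of the limiting gradients

print(clarke_gradient_abs(2.5))  # (1.0, 1.0)
print(clarke_gradient_abs(0.0))  # (-1.0, 1.0)
```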

Let I ⊆ R. A set-valued map F: I → P_kc(R^n) is said to be measurable if, for all x ∈ R^n, the R-valued function t ↦ dist(x, F(t)) is measurable. This definition of measurability is equivalent to saying that Graph(F) ∈ L(I) × B(R^n) (graph measurability), where L(I) is the Lebesgue σ-field of I and B(R^n) is the Borel σ-field of R^n.

Let X and Y be Hausdorff topological spaces and let F map X into the nonempty subsets of Y. We say that the set-valued map F is upper semicontinuous if, for every nonempty closed subset C of Y, the set F^{-1}(C) = {x ∈ X : F(x) ∩ C ≠ ∅} is closed in X.

The set-valued map is said to have a closed (convex, compact) image if, for each , is closed (convex, compact).

The rest of this paper is organized as follows. In Section 2, the model formulation and some preliminaries are given. In Section 3, the existence and asymptotically almost periodic behavior of Filippov solutions are analyzed. Moreover, the proof of the existence of almost periodic solution is given. The global exponential stability is discussed, and a delay-dependent criterion is established in terms of LMIs. In Section 4, two numerical examples are presented to demonstrate the validity of the proposed results. Some conclusions are drawn in Section 5.

2. Model Description and Preliminaries

In this paper, we consider a general class of neural networks whose dynamics is described by the system of differential equations

\dot{x}_i(t) = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t-\tau(t))) + \sum_{j=1}^{n} c_{ij} \int_{t-\sigma(t)}^{t} f_j(x_j(s))\,ds + I_i(t), \quad i = 1, 2, \ldots, n,

or equivalently the vector form

\dot{x}(t) = -Dx(t) + Af(x(t)) + Bf(x(t-\tau(t))) + C\int_{t-\sigma(t)}^{t} f(x(s))\,ds + I(t),

where x(t) = (x_1(t), ..., x_n(t))^T is the vector of neuron states at time t; D = diag(d_1, d_2, ..., d_n) is an n × n diagonal matrix with d_i > 0, where d_i is the neuron self-inhibition; A = (a_{ij})_{n×n}, B = (b_{ij})_{n×n}, and C = (c_{ij})_{n×n} are real connection weight matrices representing the weighting coefficients of the neurons; f(x) = (f_1(x_1), f_2(x_2), ..., f_n(x_n))^T represents the neuron input-output activations; I(t) is a real vector function representing the external inputs of the neurons at time t; the functions τ(t) and σ(t) denote the discrete and distributed time-varying delays, respectively, satisfying 0 ≤ τ(t) ≤ τ and 0 ≤ σ(t) ≤ σ for positive constants τ and σ. The activation function f satisfies the assumption that

: f_i, i = 1, 2, ..., n, is piecewise continuous; that is, f_i is continuous in R except at a countable set of jump discontinuity points and, in every compact set of R, has only a finite number of jump discontinuity points;

f_i, i = 1, 2, ..., n, is nondecreasing.

Under the assumption above, f is undefined at the points where it is discontinuous, and co[f(x)] = (co[f_1(x_1)], co[f_2(x_2)], ..., co[f_n(x_n)])^T, where co[f_i(x_i)] = [f_i(x_i^-), f_i(x_i^+)], i = 1, 2, ..., n. The system (4) is a differential equation with a discontinuous right-hand side. For the system (4), we adopt the following definition of the solution in the sense of Filippov [43].

Definition 1. A function , , is a solution of the system (4) on if (1) is continuous on and absolutely continuous on ; (2) satisfies where .

By the assumption , it is easy to check that : is an upper semicontinuous set-valued map with nonempty, compact, and convex values. Hence is measurable [44]. By the measurable selection theorem, if is a solution of the system (4), then there exists a measurable function such that and for a.a. .

The function in (7) is called an output solution associated with the state variable and represents the vector of neural network outputs.
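To make the setup concrete, the following is a heuristic simulation of a small network of this form with the discontinuous activation f = sign. All parameters are hypothetical, the distributed-delay term is omitted, and a plain forward Euler scheme only approximates a Filippov solution (it does not resolve sliding modes):

```python
# Heuristic forward-Euler simulation of x'(t) = -D x(t) + A f(x(t))
# + B f(x(t - tau)) + I with f = sign.  All parameters are hypothetical
# examples; the distributed-delay term of the general model is omitted,
# and Euler integration is only a rough stand-in for a Filippov solution.
import numpy as np

D = np.diag([1.0, 1.0])                  # self-inhibition
A = np.array([[0.2, -0.1], [0.1, 0.3]])  # instantaneous weights
B = np.array([[0.1, 0.0], [0.0, 0.1]])   # delayed weights
I = np.array([0.5, -0.5])                # constant external input
tau, h, T = 0.5, 0.01, 20.0              # delay, step size, horizon

steps, lag = int(T / h), int(tau / h)
x = np.zeros((steps + 1, 2))
x[0] = [1.0, -1.0]                       # initial value, held on [-tau, 0]

for k in range(steps):
    x_del = x[max(k - lag, 0)]           # delayed state (constant history)
    dx = -D @ x[k] + A @ np.sign(x[k]) + B @ np.sign(x_del) + I
    x[k + 1] = x[k] + h * dx

print("final state:", x[-1])             # trajectory settles near a point
```

With these (hypothetical) parameters the signs of the two states never change, so the trajectory converges to the equilibrium D^{-1}(A + B)sign(x(0)) + D^{-1}I.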

Definition 2. For any continuous function and any measurable selection , such that for a.a. , an absolutely continuous function associated with a measurable function is said to be a solution of the initial value problem (IVP) for the system (4) on ( might be ) with the initial value , , if

Definition 3 (see [45]). A continuous function x: R → R^n is said to be almost periodic on R if, for any scalar ε > 0, there exist a scalar l(ε) > 0 and, in any interval of length l(ε), a scalar ω such that ||x(t + ω) − x(t)|| < ε for all t ∈ R.

Definition 4. The almost periodic solution of the system (4) is said to be globally exponentially stable, if there exist scalars and , such that where is the solution of the system (4) with the initial value , , and is called the exponential convergence rate.

To obtain the main results of this paper, the following lemmas will be needed.

Lemma 5 (chain rule [42]). If is C-regular and is absolutely continuous on any compact interval of , then and are differentiable for a.a. , and

Lemma 6 (Jensen’s inequality). For any constant matrix M ∈ R^{n×n} with M = M^T > 0, any scalars a and b with a < b, and a vector function φ: [a, b] → R^n such that the integrals concerned are well defined,

(b - a) \int_a^b \varphi^T(s) M \varphi(s)\,ds \ge \Big( \int_a^b \varphi(s)\,ds \Big)^T M \Big( \int_a^b \varphi(s)\,ds \Big).
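As a sanity check, the matrix Jensen inequality can be verified numerically on a grid; the function φ and matrix M below are arbitrary choices, not quantities from the paper:

```python
# Numerical spot-check of the matrix Jensen inequality: for M = M^T > 0
# and a vector function phi on [a, b],
#   (int phi)^T M (int phi) <= (b - a) * int phi^T M phi.
# Integrals are approximated by Riemann sums on a fine grid.
import numpy as np

a, b, N = 0.0, 2.0, 20000
s = np.linspace(a, b, N)
phi = np.vstack([np.sin(3 * s), np.cos(s) + 0.5 * s])  # phi: [a,b] -> R^2
M = np.array([[2.0, 0.3], [0.3, 1.0]])                 # symmetric, pos. def.

ds = (b - a) / (N - 1)
int_phi = phi.sum(axis=1) * ds                         # vector integral
lhs = int_phi @ M @ int_phi
rhs = (b - a) * np.einsum('in,ij,jn->', phi, M, phi) * ds

print(lhs <= rhs)
```

Equality holds only when φ is constant, so for a genuinely varying φ the gap is strict and a grid approximation cannot flip the comparison.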

Lemma 7. Let ε > 0 be a scalar. Given any a, b ∈ R^n,

2 a^T b \le \varepsilon\, a^T a + \varepsilon^{-1}\, b^T b.
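A quick randomized spot-check of the standard Young-type bound 2aᵀb ≤ εaᵀa + (1/ε)bᵀb, which is the elementary inequality used repeatedly in the delay estimates below:

```python
# Randomized spot-check of the Young-type inequality
#   2 a^T b <= eps * a^T a + (1/eps) * b^T b   for any scalar eps > 0.
# It follows from expanding ||sqrt(eps) a - b / sqrt(eps)||^2 >= 0.
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    a = rng.normal(size=3)
    b = rng.normal(size=3)
    eps = float(rng.uniform(0.1, 10.0))
    assert 2 * a @ b <= eps * (a @ a) + (1 / eps) * (b @ b) + 1e-12

print("inequality held in all trials")
```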

Before proceeding to the main results, we further make the following assumptions.

: , , and are continuous functions and possess almost periodic property; that is, for any , there exist and in any interval with the length of , such that

: For any , , , there exists constant , such that

: There exist a positive definite diagonal matrix and two positive definite symmetric matrices , , such that

3. Main Results

In this section, the main results concerned with the stability of the almost periodic solution are addressed for a general class of neural networks (4).

Firstly, we give the proof of the existence of the solution and discuss the asymptotically almost periodic behavior of the solution for the system (4).

Proposition 8. If the assumption holds, then there exist a positive constant , a positive diagonal matrix , and two positive definite symmetric matrices and such that where and is the -dimensional identity matrix.

Proof. Let , , and . Choose a scalar ; then, by the Schur complement, (17) is equivalent to . Thus, we can choose a sufficiently small positive constant , such that , where . Let , , and ; then Proposition 8 is proved.
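The Schur complement step invoked in this proof rests on the standard equivalence: a symmetric block matrix [[A, B], [Bᵀ, C]] is positive definite iff C > 0 and A − BC⁻¹Bᵀ > 0. A numerical illustration with arbitrary example blocks (not the paper's matrices):

```python
# Numerical illustration of the Schur complement equivalence:
# [[A, B], [B^T, C]] > 0  iff  C > 0 and A - B C^{-1} B^T > 0.
# The blocks below are arbitrary examples.
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0, 0.0], [0.0, 1.0]])
C = np.array([[2.0, 0.5], [0.5, 2.0]])

block = np.block([[A, B], [B.T, C]])

def is_pd(M):
    """Positive definiteness via eigenvalues of the symmetric part."""
    return bool(np.all(np.linalg.eigvalsh((M + M.T) / 2) > 0))

schur = A - B @ np.linalg.inv(C) @ B.T
print(is_pd(block), is_pd(C) and is_pd(schur))  # the two tests agree
```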

Theorem 9. If the assumptions , , and are satisfied, then the neural network system (4) has a solution of IVP on for any initial value , .

Proof. For any initial value , , similar to the proof of Lemma 1 in [2], under the assumptions , the system (4) has a local solution associated with a measurable function with initial value , on , where or and is the maximal right-side existence interval of the local solution.
Consider the following Lyapunov functional candidate: where , , , and are defined as in Proposition 8. By Lemma 5, calculating the time derivative of along the local solution of the system (4) on yields Without loss of generality, we can suppose . In fact, if this is not the case, set , ; then the system (7) can be equivalently rewritten as where , for a.a. , and . It is obvious that . By , , and the assumption , we have Using Lemmas 6 and 7, we obtain where is defined as in Proposition 8. By the assumption , is bounded for . Hence there exists a constant such that This implies that Integrating both sides of (27) from to , , it follows that In view of the definition of in (17) and the fact that all the terms in are nonnegative, we have By combining (28) and (29), it is easy to obtain Therefore, . By the viability theorem in differential inclusion theory [44], we obtain . That is, the system (4) has a solution of IVP on for any initial value. The proof is completed.

Theorem 10. Suppose that the assumptions are satisfied; then any solution of the neural network system (4) is asymptotically almost periodic.

Proof. Let be a solution of IVP of the system (4) associated with a measurable function with initial value , . Setting , we have where Consider a Lyapunov functional candidate as Calculating the time derivative of along trajectories of the system (31), similar to the proof of Theorem 9, we can get From the proof of Theorem 9, is bounded. Consequently, is also bounded. By the assumption , there exist positive constants and , such that Therefore, by using the assumption , it is easy to obtain that, for any , there exist and in any interval with the length of , such that This implies By combining (33) and (37), we have Therefore, there exists , such that, for any , ; that is, . This shows that any solution of the system (4) is asymptotically almost periodic. The proof is completed.

Remark 11. If the discrete time-varying delay reduces to a constant delay (i.e., ), then . Without making the assumption , from the proof above, it is easy to see that Theorem 10 can also be obtained. If both and reduce to constant delays (i.e., , ), then . In this case, Theorem 10 is also true.

Next, we will discuss the existence, uniqueness, and global exponential stability of the almost periodic solutions for the system (4).

Theorem 12. Suppose that the assumptions hold; then the neural network system (4) has a unique almost periodic solution which is globally exponentially stable.

Proof. Firstly, we prove the existence of the almost periodic solution for the system (4).
Under the assumptions of Theorem 12, for any initial value , , the neural network (4) has a solution which is asymptotically almost periodic. Let be any solution of the system (4) associated with a measurable function with the initial value , . Then for a.a. .
By using (36), we can pick a sequence satisfying and , for all , where is defined in (32). In addition, the sequence is equicontinuous and uniformly bounded. By the Arzelà-Ascoli theorem and the diagonal selection principle, we can select a subsequence of (still denoted by ) that converges uniformly to an absolutely continuous function on any compact set of .
On the other hand, since and are bounded by the boundedness of , the sequence is bounded. Hence we can also select a subsequence of (still denoted by ) that converges to a measurable function for any . According to the facts that (i) is an upper semicontinuous set-valued map and (ii) for , as , we can get that, for any , there exists , such that for and , where is the -dimensional unit ball. Hence implies that . On the other hand, since is a compact subset of , we have . Noting the arbitrariness of , it follows that for a.a. .
By Lebesgue's dominated convergence theorem, we can obtain for any and . This implies that is a solution of the system (4).
Notice that is asymptotically almost periodic; then, for any , there exist , , and in any interval with the length of , such that , for all . Therefore, there exists a constant such that, when and , for any . Let ; it follows that , for any . This shows that is an almost periodic solution of the system (4).
Consider the change of variables , which transforms (4) into the differential equation where is measurable, and and is the output solution associated with .
Similarly to (17), define a Lyapunov functional candidate as Calculating the derivative of along the solution of the system (41) and arguing as in the proof of Theorem 9, we have where Combining (42) and (43) gives This implies that the almost periodic solution of the system (4) is globally exponentially stable. Consequently, the almost periodic solution of the system (4) is unique. The proof is completed.

Remark 13. Under the assumptions and , if the assumption is replaced by the assumption that there exists a positive diagonal matrix such that the coefficient matrices , , and of the system (4) satisfy the following LMI:
then the conclusion of Theorem 9 is still true. In fact, similar to the proof of Proposition 8, we can choose positive constants and , such that where .
Consider the following Lyapunov functional candidate: Under the assumptions and , calculating the time derivative of along the local solution of the system (4), we can obtain where This implies that the conclusion of Theorem 9 is still true.
In addition, under the case above, if the assumption is satisfied, then it is easy to get that the conclusions of Theorems 10 and 12 are also correct.

Notice that a periodic function can be regarded as a special almost periodic function. Hence, based on Theorems 9 and 12, we can obtain the following corollary.

Corollary 14. Suppose that , , and are periodic functions; if the assumptions , , and (or ) are satisfied, then (1) the neural network system (4) has a solution of IVP on for any initial value , ; (2) the neural network system (4) has a unique periodic solution which is globally exponentially stable.

When is a constant external input , the system (4) becomes Since a constant function can also be regarded as a special almost periodic function, by applying Theorems 9 and 12 to the neural network system (51), we can obtain the following corollary.

Corollary 15. If the assumptions , and (or ) are satisfied, then (1) the neural network system (51) has a solution of IVP on for any initial value , ; (2) the neural network system (51) has a unique equilibrium point which is globally exponentially stable.

4. Illustrative Examples

Example 1. Consider the second-order neural network (4) with the following system parameters: Set , , , and . It is easy to check that assumptions hold, and , , and .
Solving the LMI in (15) with an appropriate LMI solver in MATLAB, the feasible positive definite diagonal matrix and positive definite matrices , can be obtained as The assumption is also satisfied.
When the external input of the network is a periodic function, by Corollary 14, this neural network has a unique periodic solution which is globally exponentially stable.
Figures 1 and 2 display the state trajectories of this neural network with initial values , when . It can be seen that these trajectories converge to the unique periodic solution of the network. This is in accordance with the conclusion of Corollary 14.
When the external input of the network is constant, by Corollary 15, this neural network has a unique equilibrium point which is globally exponentially stable.
Figure 3 displays the state trajectories of this neural network with state initial value , , when . It can be seen that these trajectories converge to the unique equilibrium point of the network. This is in accordance with the conclusion of Corollary 15.

Figure 1: Time-domain behavior of the state variables and when .
Figure 2: Phase plane behavior of the state variables and when .
Figure 3: Time-domain behavior of the state variables and when .

Example 2. Consider the third-order neural network (4) with the following system parameters: Set , , , and . It is easy to check that assumptions hold, and , , and .
By solving the LMI in (15) with an appropriate LMI solver in MATLAB, the feasible positive definite diagonal matrix and positive definite matrices can be obtained as The assumption is also satisfied.
When the external input of the network is a periodic function, by Corollary 14, this neural network has a unique periodic solution which is globally exponentially stable.
Figures 4 and 5 display the state trajectories of this neural network with state initial value , , when . It can be seen that these trajectories converge to the unique periodic solution of the network. This is in accordance with the conclusion of Corollary 14.
When the external input of the network is constant, by Corollary 15, this neural network has a unique equilibrium point which is globally exponentially stable.
Figure 6 displays the state trajectories of this neural network with initial values , , when . It can be seen that these trajectories converge to the unique equilibrium point of the network. This is in accordance with the conclusion of Corollary 15.

Figure 4: Time-domain behavior of the state variables , , and when .
Figure 5: Phase plane behavior of the state variables , , and when .
Figure 6: Time-domain behavior of the state variables , , and when .

5. Conclusion

In this paper, the almost periodic oscillation issue has been investigated for neural networks with mixed time-varying delays and discontinuous activation functions. Some sufficient conditions which ensure the existence, uniqueness, and global exponential stability of the almost periodic solution have been obtained in terms of LMIs, which are easy to check and apply in practice. Two numerical examples have been given to illustrate the effectiveness of the present results.

In [2], Forti et al. conjectured that all solutions of delayed neural networks with discontinuous neuron activations and periodic inputs converge to an asymptotically stable limit cycle. In this paper, under the assumptions , the obtained results confirm that Forti's conjecture is true for neural networks with mixed time-varying delays and discontinuous activation functions. In the near future, the stability of almost periodic solutions for stochastic/impulsive neural networks with discontinuous activations is expected to be addressed.

Acknowledgments

This work was supported by the National Science and Technology Major Project of China (2011ZX05020-006) and the Natural Science Foundation of Hebei Province of China (A2011203103).

References

  1. M. Forti and P. Nistri, “Global convergence of neural networks with discontinuous neuron activations,” IEEE Transactions on Circuits and Systems I, vol. 50, no. 11, pp. 1421–1435, 2003. View at Publisher · View at Google Scholar · View at MathSciNet · View at Scopus
  2. M. Forti, M. Grazzini, P. Nistri, and L. Pancioni, “Generalized lyapunov approach for convergence of neural networks with discontinuous or non-lipschitz activations,” Physica D, vol. 214, no. 1, pp. 88–99, 2006. View at Publisher · View at Google Scholar · View at Scopus
  3. M. Forti, “M-matrices and global convergence of discontinuous neural networks,” International Journal of Circuit Theory and Applications, vol. 35, no. 2, pp. 105–130, 2007. View at Publisher · View at Google Scholar · View at Scopus
  4. W. Lu and T. Chen, “Dynamical behaviors of Cohen-Grossberg neural networks with discontinuous activation functions,” Neural Networks, vol. 18, no. 3, pp. 231–242, 2005. View at Publisher · View at Google Scholar · View at Scopus
  5. M. Forti, P. Nistri, and D. Papini, “Global exponential stability and global convergence in finite time of delayed neural networks with infinite gain,” IEEE Transactions on Neural Networks, vol. 16, no. 6, pp. 1449–1463, 2005. View at Publisher · View at Google Scholar · View at Scopus
  6. L. Li and L. Huang, “Global asymptotic stability of delayed neural networks with discontinuous neuron activations,” Neurocomputing, vol. 72, no. 16–18, pp. 3726–3733, 2009. View at Publisher · View at Google Scholar · View at Scopus
  7. S. Qin and X. Xue, “Global exponential stability and global convergence in finite time of neural networks with discontinuous activations,” Neural Processing Letters, vol. 29, no. 3, pp. 189–204, 2009. View at Publisher · View at Google Scholar · View at Scopus
  8. Y. Meng, L. Huang, Z. Guo, and Q. Hu, “Stability analysis of Cohen-Grossberg neural networks with discontinuous neuron activations,” Applied Mathematical Modelling, vol. 34, no. 2, pp. 358–365, 2010. View at Publisher · View at Google Scholar · View at MathSciNet
  9. L. Li and L. Huang, “Dynamical behaviors of a class of recurrent neural networks with discontinuous neuron activations,” Applied Mathematical Modelling, vol. 33, no. 12, pp. 4326–4336, 2009. View at Publisher · View at Google Scholar · View at MathSciNet
  10. J. Liu, X. Liu, and W.-C. Xie, “Global convergence of neural networks with mixed time-varying delays and discontinuous neuron activations,” Information Sciences, vol. 183, pp. 92–105, 2012. View at Publisher · View at Google Scholar · View at MathSciNet
  11. X. Nie and J. Cao, “Existence and global stability of equilibrium point for delayed competitive neural networks with discontinuous activation functions,” International Journal of Systems Science, vol. 43, no. 3, pp. 459–474, 2012. View at Publisher · View at Google Scholar · View at MathSciNet
  12. Z. Guo and L. Huang, “LMI conditions for global robust stability of delayed neural networks with discontinuous neuron activations,” Applied Mathematics and Computation, vol. 215, no. 3, pp. 889–900, 2009. View at Publisher · View at Google Scholar · View at MathSciNet
  13. X. Wu, Y. Wang, L. Huang, and Y. Zuo, “Robust exponential stability criterion for uncertain neural networks with discontinuous activation functions and time-varying delays,” Neurocomputing, vol. 73, no. 7–9, pp. 1265–1271, 2010. View at Publisher · View at Google Scholar · View at Scopus
  14. X. Liu and J. Cao, “Robust state estimation for neural networks with discontinuous activations,” IEEE Transactions on Systems, Man, and Cybernetics. Part B, vol. 40, no. 6, pp. 1425–1437, 2010. View at Google Scholar · View at Scopus
  15. X. Liu, T. Chen, J. Cao, and W. Lu, “Dissipativity and quasi-synchronization for neural networks with discontinuous activations and parameter mismatches,” Neural Networks, vol. 24, no. 10, pp. 1013–1021, 2011. View at Publisher · View at Google Scholar · View at Scopus
  16. G. Bao and Z. Zeng, “Analysis and design of associative memories based on recurrent neural network with discontinuous activation functions,” Neurocomputing, vol. 77, no. 1, pp. 101–107, 2012. View at Publisher · View at Google Scholar · View at Scopus
  17. X. Liu and J. Cao, “On periodic solutions of neural networks via differential inclusions,” Neural Networks, vol. 22, no. 4, pp. 329–334, 2009. View at Publisher · View at Google Scholar · View at Scopus
  18. L. Huang, J. Wang, and X. Zhou, “Existence and global asymptotic stability of periodic solutions for hopfield neural networks with discontinuous activations,” Nonlinear Analysis: Real World Applications, vol. 10, no. 3, pp. 1651–1661, 2009. View at Publisher · View at Google Scholar · View at MathSciNet
  19. H. Wu and Y. Li, “Existence and stability of periodic solution for BAM neural networks with discontinuous neuron activations,” Computers   Mathematics with Applications, vol. 56, no. 8, pp. 1981–1993, 2008. View at Publisher · View at Google Scholar · View at MathSciNet
  20. H. Wu, “Stability analysis for periodic solution of neural networks with discontinuous neuron activations,” Nonlinear Analysis: Real World Applications, vol. 10, no. 3, pp. 1717–1729, 2009.
  21. H. Wu, “Global stability analysis of a general class of discontinuous neural networks with linear growth activation functions,” Information Sciences, vol. 179, no. 19, pp. 3432–3441, 2009.
  22. H. Wu, X. Xue, and X. Zhong, “Stability analysis for neural networks with discontinuous neuron activations and impulses,” International Journal of Innovative Computing, Information and Control, vol. 3, no. 6B, pp. 1537–1548, 2007.
  23. H. Wu and C. Shan, “Stability analysis for periodic solution of BAM neural networks with discontinuous neuron activations and impulses,” Applied Mathematical Modelling, vol. 33, no. 6, pp. 2564–2574, 2009.
  24. D. Papini and V. Taddei, “Global exponential stability of the periodic solution of a delayed neural network with discontinuous activations,” Physics Letters A, vol. 343, no. 1–3, pp. 117–128, 2005.
  25. X. Chen and Q. Song, “Global exponential stability of the periodic solution of delayed Cohen-Grossberg neural networks with discontinuous activations,” Neurocomputing, vol. 73, no. 16–18, pp. 3097–3104, 2010.
  26. X. He, W. Lu, and T. Chen, “Nonnegative periodic dynamics of delayed Cohen-Grossberg neural networks with discontinuous activations,” Neurocomputing, vol. 73, no. 13–15, pp. 2765–2772, 2010.
  27. Z. Cai and L. Huang, “Existence and global asymptotic stability of periodic solution for discrete and distributed time-varying delayed neural networks with discontinuous activations,” Neurocomputing, vol. 74, no. 17, pp. 3170–3179, 2011.
  28. Z. Cai, L. Huang, Z. Guo, and X. Chen, “On the periodic dynamics of a class of time-varying delayed neural networks via differential inclusions,” Neural Networks, vol. 33, pp. 97–113, 2012.
  29. W. Allegretto, D. Papini, and M. Forti, “Common asymptotic behavior of solutions and almost periodicity for discontinuous, delayed, and impulsive neural networks,” IEEE Transactions on Neural Networks, vol. 21, no. 7, pp. 1110–1125, 2010.
  30. W. Lu and T. Chen, “Almost periodic dynamics of a class of delayed neural networks with discontinuous activations,” Neural Computation, vol. 20, no. 4, pp. 1065–1090, 2008.
  31. W. Ding and L. Wang, “2^N almost periodic attractors for Cohen-Grossberg-type BAM neural networks with variable coefficients and distributed delays,” Journal of Mathematical Analysis and Applications, vol. 373, no. 1, pp. 322–342, 2011.
  32. X. Huang and J. Cao, “Almost periodic solution of shunting inhibitory cellular neural networks with time-varying delay,” Physics Letters A, vol. 314, no. 3, pp. 222–231, 2003.
  33. W. Lu and T. Chen, “Global exponential stability of almost periodic solution for a large class of delayed dynamical systems,” Science in China Series A, vol. 48, no. 8, pp. 1015–1026, 2005.
  34. Y. Liu, Z. You, and L. Cao, “On the almost periodic solution of generalized Hopfield neural networks with time-varying delays,” Neurocomputing, vol. 69, no. 13–15, pp. 1760–1767, 2006.
  35. H. Xiang and J. Cao, “Almost periodic solution of Cohen-Grossberg neural networks with bounded and unbounded delays,” Nonlinear Analysis: Real World Applications, vol. 10, no. 4, pp. 2407–2419, 2009.
  36. L. Wang, “Existence and global attractivity of almost periodic solutions for delayed high-ordered neural networks,” Neurocomputing, vol. 73, no. 4–6, pp. 802–808, 2010.
  37. L. Wang, W. Lu, and T. Chen, “Multistability and new attraction basins of almost-periodic solutions of delayed neural networks,” IEEE Transactions on Neural Networks, vol. 20, no. 10, pp. 1581–1593, 2009.
  38. C. Bai, “Existence and stability of almost periodic solutions of Hopfield neural networks with continuously distributed delays,” Nonlinear Analysis: Theory, Methods & Applications, vol. 71, no. 11, pp. 5850–5859, 2009.
  39. H. Jiang, L. Zhang, and Z. Teng, “Existence and global exponential stability of almost periodic solution for cellular neural networks with variable coefficients and time-varying delays,” IEEE Transactions on Neural Networks, vol. 16, no. 6, pp. 1340–1351, 2005.
  40. S. Qin, X. Xue, and P. Wang, “Global exponential stability of almost periodic solution of delayed neural networks with discontinuous activations,” Information Sciences, vol. 220, pp. 367–378, 2013.
  41. S. Boyd, L. El Ghaoui, E. Feron, and V. Balakrishnan, Linear Matrix Inequalities in System and Control Theory, SIAM, Philadelphia, Pa, USA, 1994.
  42. F. H. Clarke, Optimization and Nonsmooth Analysis, Wiley, New York, NY, USA, 1983.
  43. A. F. Filippov, Differential Equations with Discontinuous Right-Hand Side, Mathematics and Its Applications, Soviet Series, Kluwer Academic Publishers, Boston, Mass, USA, 1984.
  44. J.-P. Aubin and A. Cellina, Differential Inclusions, vol. 264, Springer, Berlin, Germany, 1984. View at Publisher · View at Google Scholar · View at MathSciNet
  45. B. M. Levitan and V. V. Zhikov, Almost Periodic Functions and Differential Equations, Cambridge University Press, Cambridge, UK, 1982. View at MathSciNet