Abstract and Applied Analysis

Volume 2014 (2014), Article ID 263847, 9 pages

http://dx.doi.org/10.1155/2014/263847

## Global μ-Stability of Complex-Valued Neural Networks with Unbounded Time-Varying Delays

^{1}Department of Mathematics, Chongqing Jiaotong University, Chongqing 400074, China; ^{2}School of Computer Science and Technology, Nanjing University of Science and Technology, Nanjing 210094, China; ^{3}Department of Mathematics, Huzhou Teachers College, Huzhou 313000, China

Received 4 January 2014; Revised 23 February 2014; Accepted 25 February 2014; Published 3 April 2014

Academic Editor: Derui Ding

Copyright © 2014 Xiaofeng Chen et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

The complex-valued neural networks with unbounded time-varying delays are considered. By constructing appropriate Lyapunov-Krasovskii functionals and employing the free weighting matrix method, several delay-dependent criteria for checking the global μ-stability of the addressed complex-valued neural networks are established in the form of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Two examples with simulations are given to show the effectiveness and reduced conservatism of the proposed criteria.

#### 1. Introduction

Nonlinear systems are ubiquitous in real life [1–4], and complex networks form one important class of them [5–7]; neural networks, in turn, are among the most important complex networks [8, 9]. In recent years, there has been increasing research interest in analyzing the dynamic behaviors of complex-valued neural networks due to their widespread applications in filtering, imaging, optoelectronics, speech synthesis, computer vision, and so on; see [10–16] and the references therein. In these applications, the stability of complex-valued neural networks plays a very important role.

In real-valued neural networks, the activation function is usually chosen to be smooth and bounded. However, in the complex domain, by Liouville's theorem [17], every bounded entire function must be constant; thus, if the activation function is entire and bounded in the complex domain, it reduces to a constant, which is unsuitable. Therefore, the choice of activation functions is the main challenge for complex-valued neural networks. In [18–20], the authors considered three kinds of activation functions, respectively. In [18], the authors considered a class of continuous-time recurrent neural networks with two types of activation functions and gave several sufficient conditions for the existence, uniqueness, and global stability of the equilibrium point. In [19], a class of generalized discrete complex-valued neural networks was studied; the existence of a unique equilibrium pattern was discussed, and a sufficient condition for global exponential stability was given. In [20], the authors discussed a class of discrete-time recurrent neural networks with complex-valued linear threshold neurons; the boundedness, global attractivity, and complete stability of such networks were investigated, and some conditions for these properties were derived. In [21], discrete-time delayed neural networks with complex-valued linear threshold neurons were investigated, and several criteria on boundedness and global exponential stability were obtained. In [22–24], the authors considered different kinds of time delays in complex-valued neural networks. In [22], the boundedness and complete stability of complex-valued neural networks with time delay were studied; some conditions guaranteeing boundedness and complete stability for the considered neural networks were derived by using local inhibition and the energy minimization method. In [23], the authors investigated the dynamic behaviors of a class of complex-valued neural networks with mixed time delays.
Some sufficient conditions assuring the existence, uniqueness, and exponential stability of the equilibrium point of the system were derived using the vector Lyapunov function method, the homeomorphism mapping lemma, and matrix theory. In [24], complex-valued neural networks with both leakage time delay and discrete time delay, as well as two types of activation functions, on time scales were considered; several delay-dependent criteria for checking the global stability of the addressed complex-valued neural networks were established in linear matrix inequality form by constructing appropriate Lyapunov-Krasovskii functionals and employing the free weighting matrix method. However, the results above were obtained under the assumption that the time delay is bounded. As is well known, time delays occur and vary frequently and irregularly in many engineering systems; sometimes they depend heavily on the system history and may be unbounded [25–27].

How can the desired stability be guaranteed when the time delays are unbounded? Recently, two new concepts were proposed, power stability and μ-stability [28, 29], which can be applied to dynamical systems with unbounded time-varying delays. In [28], the authors considered dynamical systems with unbounded time-varying delays; two approaches were developed to derive sufficient conditions ensuring the existence and uniqueness of the equilibrium and its global stability, and, under mild conditions, the authors proved that such systems are globally power stable. In [29], the authors discussed the μ-stability of dynamical systems with unbounded time-varying delays and showed that, under mild conditions, the delayed system with very large time delays has a unique equilibrium, which is globally μ-stable. In [30], the authors investigated the global robust stability of uncertain stochastic neural networks with unbounded time-varying delays and norm-bounded parameter uncertainties; a new concept of global robust μ-stability in the mean square for neural networks was given first, and stability criteria were then presented by means of the linear matrix inequality approach. To the best of our knowledge, however, few authors have considered the μ-stability of complex-valued neural networks with unbounded time-varying delays.

Motivated by the above discussions, the objective of this paper is to study the global μ-stability of complex-valued neural networks with unbounded time-varying delays by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. The obtained sufficient conditions do not require the existence of the partial derivatives of the activation functions and are expressed as linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Two examples are given to show the effectiveness and reduced conservatism of the proposed criteria.

In the literature, the usual method for analyzing complex-valued neural networks is to separate them into real and imaginary parts and then to recast them as equivalent real-valued neural networks [18, 24]. However, this separation is not always expressible in an analytical form if the real and imaginary parts of the activation functions cannot be separated. In this paper, we do not separate the complex-valued neural networks and instead study their properties directly on $\mathbb{C}^n$.

*Notations*. The notations are quite standard. Throughout this paper, let $i$ denote the imaginary unit; that is, $i^2=-1$. $\mathbb{C}^{n}$, $\mathbb{R}^{n\times n}$, and $\mathbb{C}^{n\times n}$ denote, respectively, the set of $n$-dimensional complex vectors, $n\times n$ real matrices, and $n\times n$ complex matrices. The superscripts $T$ and $*$ denote matrix transposition and matrix complex conjugate transposition, respectively. For a complex vector $z\in\mathbb{C}^{n}$, let $|z|$ be the module of the vector and $\|z\|$ the norm of the vector. The notation $X\geq 0$ (resp., $X>0$) means that $X$ is positive semidefinite (resp., positive definite). $\lambda_{\max}(P)$ and $\lambda_{\min}(P)$ are defined as the largest and smallest eigenvalues of a positive definite matrix $P$, respectively. Sometimes, the arguments of a function or a matrix will be omitted in the analysis when no confusion can arise.

#### 2. Problems Formulation and Preliminaries

In this section, we will recall some definitions and lemmas which will be used in the proofs of our main results.

In this paper, we consider the following complex-valued neural networks with time-varying delays:
$$\dot{z}(t)=-Dz(t)+Af(z(t))+Bf(z(t-\tau(t)))+J$$
for $t\geq 0$, where $z(t)=(z_1(t),z_2(t),\ldots,z_n(t))^{T}\in\mathbb{C}^{n}$ is the state vector of the neural network with $n$ neurons at time $t$. $D=\operatorname{diag}(d_1,d_2,\ldots,d_n)\in\mathbb{R}^{n\times n}$ is the self-feedback connection weight matrix with $d_j>0$ ($j=1,2,\ldots,n$). $A\in\mathbb{C}^{n\times n}$ and $B\in\mathbb{C}^{n\times n}$ are the connection weight matrix and the delayed connection weight matrix, respectively. $f(z(t))$ denotes the neuron activation at time $t$. $J\in\mathbb{C}^{n}$ is the external input vector.

The initial condition associated with neural network (1) is given by $z(s)=\varphi(s)$, $s\in(-\infty,0]$, where $\varphi$ is continuous on $(-\infty,0]$.
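To make the setup concrete, the following sketch integrates a delayed complex-valued network of the standard form $\dot z(t)=-Dz(t)+Af(z(t))+Bf(z(t-\tau))+J$ with a simple forward-Euler scheme. All parameter values, the activation, and the choice of a constant delay are hypothetical illustrations and are not taken from the paper's examples.

```python
import numpy as np

# Hypothetical two-neuron complex-valued network of the standard delayed form
#   dz/dt = -D z(t) + A f(z(t)) + B f(z(t - tau)) + J
# All parameter values below are illustrative, not the paper's example data.
D = np.diag([2.0, 2.0]).astype(complex)
A = np.array([[0.2 + 0.1j, -0.1j], [0.1, 0.3 - 0.2j]])
B = np.array([[0.1 - 0.1j, 0.2], [-0.2j, 0.1 + 0.1j]])
J = np.array([0.5 + 0.5j, -0.5 + 0.5j])

def f(z):
    # A bounded activation acting on real and imaginary parts separately
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

tau, h, T = 1.0, 0.001, 20.0
steps, delay_steps = int(T / h), int(tau / h)

# Constant initial history phi(s) on [-tau, 0]
hist = [np.array([1.0 + 1.0j, -1.0 - 1.0j])] * (delay_steps + 1)
for k in range(steps):
    z = hist[-1]
    z_delay = hist[-1 - delay_steps]          # z(t - tau)
    dz = -D @ z + A @ f(z) + B @ f(z_delay) + J
    hist.append(z + h * dz)                   # forward-Euler step

print(np.round(hist[-1], 4))  # state after T time units
```

With the self-feedback gains dominating the connection weights, the simulated state settles to an equilibrium; this is the qualitative behavior that the stability criteria below certify.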

As we know, the activation functions play a very important role in the study of global stability of neural networks. However, in the complex domain, the activation functions cannot be both entire and bounded. In this paper, we will consider the following two types of activation functions.

(H1) Let . can be expressed by separating into its real and imaginary part as for , where and . Suppose there exist positive constants , , , and such that for any , . Moreover, we define , where , , and .

(H2) The real and imaginary parts of cannot be separated, but is bounded and satisfies the following condition: for any , , where is a constant. Moreover, we define .

For the time-varying delays , we give the following assumption.

(H3) is nonnegative and differentiable and satisfies , where is a nonnegative constant.

As usual, a vector is said to be an equilibrium point of neural network (1) if it satisfies

Throughout the paper, the equilibrium of neural network (1) is assumed to exist. For notational convenience, we will always shift an intended equilibrium point of neural network (1) to the origin by letting . It is easy to transform neural network (1) into the following form: for , where and .

Next we introduce some definitions and lemmas to be used in the stability analysis.

*Definition 1. *Suppose that $\mu(t)$ is a positive continuous function and satisfies $\mu(t)\to\infty$ as $t\to\infty$. If there exist scalars $M>0$ and $T\geq 0$ such that
$$\|z(t)-\breve{z}\|\leq \frac{M}{\mu(t)},\quad t\geq T,$$
where $\breve{z}$ is the equilibrium point, then neural network (1) is said to be $\mu$-stable.

*Definition 2. *If there exist scalars $M>0$, $\varepsilon>0$, and $T\geq 0$ such that
$$\|z(t)-\breve{z}\|\leq M e^{-\varepsilon t},\quad t\geq T,$$
then neural network (1) is said to be exponentially stable.

*Definition 3. *If there exist scalars $M>0$, $\varepsilon>0$, and $T\geq 0$ such that
$$\|z(t)-\breve{z}\|\leq M t^{-\varepsilon},\quad t\geq T,$$
then neural network (1) is said to be power stable.

*Remark 4. *It is obvious that both $e^{\varepsilon t}$ and $t^{\varepsilon}$ approach $+\infty$ as $t$ approaches $+\infty$. Therefore, neural network (1) is $\mu$-stable if it is exponentially stable or power stable; in other words, exponential stability and power stability are two special cases of $\mu$-stability.
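As a minimal numerical illustration of this remark (with hypothetical constants $M$ and $\varepsilon$), an exponential bound $Me^{-\varepsilon t}$ is exactly a $\mu$-stability bound $M/\mu(t)$ with $\mu(t)=e^{\varepsilon t}$, and a power bound corresponds to $\mu(t)=t^{\varepsilon}$:

```python
import math

# Hypothetical constants for illustration only
M, eps = 3.0, 0.4

for t in [0.0, 1.0, 5.0, 25.0]:
    # exponential stability bound ...
    exp_bound = M * math.exp(-eps * t)
    # ... equals the mu-stability bound with mu(t) = exp(eps * t)
    assert math.isclose(exp_bound, M / math.exp(eps * t))

for t in [1.0, 5.0, 25.0]:
    # power stability bound equals the mu-stability bound with mu(t) = t**eps
    assert math.isclose(M * t ** (-eps), M / t ** eps)

print("exponential and power bounds are special cases of mu-stability bounds")
```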

Lemma 5 (see [22]). *Given a Hermitian matrix $S=\begin{pmatrix}S_{11} & S_{12}\\ S_{12}^{*} & S_{22}\end{pmatrix}$, then $S<0$ is equivalent to
$$S_{22}<0,\qquad S_{11}-S_{12}S_{22}^{-1}S_{12}^{*}<0,$$
where $S_{11}^{*}=S_{11}$ and $S_{22}^{*}=S_{22}$.*
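Lemma 5 is the Schur complement criterion for Hermitian block matrices. A quick numerical sanity check, using a randomly generated Hermitian matrix rather than any matrix from the paper, can be sketched as follows:

```python
import numpy as np

rng = np.random.default_rng(0)

def is_neg_def(M):
    # Hermitian negative definiteness via the largest eigenvalue
    return np.linalg.eigvalsh(M).max() < 0

# Build a random Hermitian, negative definite block matrix
# S = [[S11, S12], [S12^H, S22]]  (values are illustrative only)
n = 3
X = rng.standard_normal((2 * n, 2 * n)) + 1j * rng.standard_normal((2 * n, 2 * n))
S = -(X @ X.conj().T) - 0.1 * np.eye(2 * n)
S11, S12, S22 = S[:n, :n], S[:n, n:], S[n:, n:]

# Schur-complement equivalence: S < 0  iff  S22 < 0 and
# S11 - S12 S22^{-1} S12^H < 0
schur = S11 - S12 @ np.linalg.solve(S22, S12.conj().T)
assert is_neg_def(S) == (is_neg_def(S22) and is_neg_def(schur))
print("Schur complement condition agrees with the direct check")
```

This equivalence is what lets a large LMI be verified through its smaller diagonal blocks and complements.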

#### 3. Main Results

In this section, we give several criteria for checking the global $\mu$-stability of the complex-valued neural networks (1).

Theorem 6. *Assume that assumptions (H1) and (H3) hold. Neural network (1) is globally $\mu$-stable if there exist two positive definite Hermitian matrices and , two real positive diagonal matrices and , two complex matrices , , a positive differentiable function defined on , and three constants , , and such that
**
and the following LMI holds
**
where
**
in which
*

*Proof. *Consider the following Lyapunov-Krasovskii functional candidates:
Calculate the derivative of along the trajectories of neural network (7). We obtain
In deriving inequality (17), we have made use of condition (12) and assumption (H3).

From assumption (H1), we get
for all , where , . Hence,
where , , , . Let . It follows from (19) that
for . Thus

Also, we can get that

From (7), we have that
It follows from (17), (21), (22), and (23) that
where and
with , , , . From (14), we have that . It follows from (13) and Lemma 5 that . Thus, we get from (24) that
which means is monotonically nonincreasing for . So we have
From the definition of in (16), we obtain that
where . It implies that
where . The proof is completed.

*Theorem 7. Assume that assumptions (H2) and (H3) hold. Neural network (1) is globally $\mu$-stable if there exist two positive definite Hermitian matrices and , two real positive diagonal matrices and , two complex matrices , , a positive differentiable function defined on , and three constants , , and such that conditions (12), (13), and (14) in Theorem 6 are satisfied, where and .*

*Proof. *From assumption (H2), we get
for . Let . It follows from (30) that
for . Hence
Also, we can get that
Consider the Lyapunov-Krasovskii functional (16). From (17), (23), (32), and (33), we can get that
where and
with , , , . From the proof of Theorem 6, we know that neural network (1) is -stable. The proof is completed.

*Corollary 8. Assume that assumptions (H1) and (H3) hold and . Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices and , two real positive diagonal matrices and , two complex matrices , , and a constant such that conditions (13) and (14) in Theorem 6 are satisfied, where , , , .*

*Proof. *Let (); then , . Take , , and . It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.

*Corollary 9. Assume that assumptions (H1) and (H3) hold and . Neural network (1) is globally power stable, if there exist two positive definite Hermitian matrices and , two real positive diagonal matrices and , two complex matrices , , and two constants and such that conditions (13) and (14) in Theorem 6 are satisfied, where , , , .*

*Proof. *Let (); then , . Take , . It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.

Similar to the proofs of Corollaries 8 and 9, we have the following results.

*Corollary 10. Assume that assumptions (H2) and (H3) hold and . Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices and , two real positive diagonal matrices and , two complex matrices , , and a constant such that conditions (13) and (14) in Theorem 7 are satisfied, where , , , .*

*Corollary 11. Assume that assumptions (H2) and (H3) hold and (). Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices and , two real positive diagonal matrices and , two complex matrices , , and two constants and such that conditions (13) and (14) in Theorem 7 are satisfied, where , , , .*

*Remark 12. *In [18, 23], the authors studied the global stability of complex-valued neural networks with time-varying delays, and the activation function was supposed to be
where and , are real-valued for all , and the partial derivatives of , with respect to and were required to exist and be continuous. In this paper, neither the real parts nor the imaginary parts of the activation functions are required to be differentiable.

*Remark 13. *In [21], the authors investigated boundedness and stability for discrete-time complex-valued neural networks with constant delay, and the activation function was supposed to be
Also, the authors gave a complex-valued LMI criterion for checking the stability of the considered neural networks, but no feasible way to solve the given complex-valued LMI was provided. In this paper, the criteria for checking the stability of the complex-valued neural networks are established as real LMIs, which can be checked numerically using the effective LMI toolbox in MATLAB.
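The conversion alluded to here rests on a standard fact: a Hermitian matrix $H$ is negative definite over $\mathbb{C}^n$ exactly when its real embedding $\begin{pmatrix}\operatorname{Re}H & -\operatorname{Im}H\\ \operatorname{Im}H & \operatorname{Re}H\end{pmatrix}$ is negative definite over $\mathbb{R}^{2n}$. A small numpy sketch of this embedding, using a randomly generated matrix rather than one from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def realify(H):
    # Standard embedding of a Hermitian matrix into a real symmetric one:
    # H < 0 in C^{n x n}  iff  [[Re H, -Im H], [Im H, Re H]] < 0 in R^{2n x 2n}
    return np.block([[H.real, -H.imag], [H.imag, H.real]])

n = 4
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
H = -(X @ X.conj().T) - 0.1 * np.eye(n)   # Hermitian, negative definite

R = realify(H)
assert np.allclose(R, R.T)                 # the embedding is real symmetric
assert np.linalg.eigvalsh(H).max() < 0
assert np.linalg.eigvalsh(R).max() < 0     # same definiteness verdict
print("complex LMI verified via its real embedding")
```

The eigenvalues of the embedding are those of $H$ with doubled multiplicity, so the definiteness verdicts always agree; this is what allows a complex LMI to be checked with real-valued LMI solvers.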

#### 4. Examples

*Example 1. *Consider a two-neuron complex-valued neural network with constant delay
where

It can be verified that the activation functions satisfy assumption (H1), the time-varying delay satisfies assumption (H3), and , , , , , . There exists a constant , and by employing the MATLAB LMI toolbox, we can find the following solutions to the LMIs in conditions (13) and (14) of Corollary 8:
Then all conditions in Corollary 8 are satisfied. Therefore, neural network (38) is globally exponentially stable. Figures 1 and 2 depict the real and imaginary parts of the states of neural network (38), where the initial condition is .

*Example 2. *Consider a two-neuron complex-valued neural network with unbounded time-varying delays
where

It can be verified that the activation functions satisfy assumption (H2), the time-varying delay satisfies assumption (H3), and , , , , , . There exist two constants and , and by employing the MATLAB LMI toolbox, we can find the following solutions to the LMIs in conditions (13) and (14) of Corollary 11:
Then all conditions in Corollary 11 are satisfied. Thus, neural network (41) is globally power stable. Figures 3 and 4 depict the real and imaginary parts of the states of network (41), where the initial condition is , .

#### 5. Conclusion

In this paper, the global μ-stability of complex-valued neural networks with unbounded time delays has been investigated by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. Several sufficient conditions for checking the global μ-stability of the considered complex-valued neural networks have been given as linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. As direct applications, exponential stability and power stability can be obtained with respect to different time-varying delays. Two examples have been provided to demonstrate the effectiveness and reduced conservatism of the proposed criteria.

*Remark 14. *Note that the activation functions in this paper are continuous. Recently, neural networks with discontinuous activations have received more and more attention [31–34]. Based on the results of this paper, we will consider the μ-stability of complex-valued neural networks with discontinuous activations in the future.

#### Conflict of Interests

The authors declare that they have no conflict of interests regarding the publication of this paper.

#### Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants 61273021 and 11172247 and in part by the Natural Science Foundation Project of CQ cstc2013jjB40008.

#### References

- H. Dong, Z. Wang, and H. Gao, “Distributed ${H}_{\infty}$ filtering for a class of Markovian jump nonlinear time-delay systems over lossy sensor networks,” *IEEE Transactions on Industrial Electronics*, vol. 60, no. 10, pp. 4665–4672, 2013.
- J. Hu, Z. Wang, B. Shen, and H. Gao, “Quantised recursive filtering for a class of nonlinear systems with multiplicative noises and missing measurements,” *International Journal of Control*, vol. 86, no. 4, pp. 650–663, 2013.
- Y. Fang, K. Yan, and K. Li, “Impulsive synchronization of a class of chaotic systems,” *Systems Science and Control Engineering*, vol. 2, no. 1, pp. 55–60, 2014.
- M. Darouach and M. Chadli, “Admissibility and control of switched discrete-time singular systems,” *Systems Science and Control Engineering*, vol. 1, no. 1, pp. 43–51, 2013.
- B. Shen, Z. Wang, D. Ding, and H. Shu, “${H}_{\infty}$ state estimation for complex networks with uncertain inner coupling and incomplete measurements,” *IEEE Transactions on Neural Networks and Learning Systems*, vol. 24, no. 12, pp. 2027–2037, 2013.
- B. Shen, Z. Wang, and X. Liu, “Sampled-data synchronization control of dynamical networks with stochastic sampling,” *IEEE Transactions on Automatic Control*, vol. 57, no. 10, pp. 2644–2650, 2012.
- J. Liang, Z. Wang, Y. Liu, and X. Liu, “State estimation for two-dimensional complex networks with randomly occurring nonlinearities and randomly varying sensor delays,” *International Journal of Robust and Nonlinear Control*, vol. 24, no. 1, pp. 18–38, 2014.
- X. Kan, Z. Wang, and H. Shu, “State estimation for discrete-time delayed neural networks with fractional uncertainties and sensor saturations,” *Neurocomputing*, vol. 117, pp. 64–71, 2013.
- Z. Wang, B. Zineddin, J. Liang et al., “A novel neural network approach to cDNA microarray image segmentation,” *Computer Methods and Programs in Biomedicine*, vol. 111, no. 1, pp. 189–198, 2013.
- A. Hirose, *Complex-Valued Neural Networks*, Springer, Berlin, Germany, 2012.
- G. Tanaka and K. Aihara, “Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction,” *IEEE Transactions on Neural Networks*, vol. 20, no. 9, pp. 1463–1473, 2009.
- S. Chen, L. Hanzo, and S. Tan, “Symmetric complex-valued RBF receiver for multiple-antenna-aided wireless systems,” *IEEE Transactions on Neural Networks*, vol. 19, no. 9, pp. 1659–1665, 2008.
- I. Cha and S. A. Kassam, “Channel equalization using adaptive complex radial basis function networks,” *IEEE Journal on Selected Areas in Communications*, vol. 13, no. 1, pp. 122–131, 1995.
- T. Nitta, “Orthogonality of decision boundaries in complex-valued neural networks,” *Neural Computation*, vol. 16, no. 1, pp. 73–97, 2004.
- M. Faijul Amin and K. Murase, “Single-layered complex-valued neural network for real-valued classification problems,” *Neurocomputing*, vol. 72, no. 4–6, pp. 945–955, 2009.
- B. K. Tripathi and P. K. Kalra, “On efficient learning machine with root-power mean neuron in complex domain,” *IEEE Transactions on Neural Networks*, vol. 22, no. 5, pp. 727–738, 2011.
- J. H. Mathews and R. W. Howell, *Complex Analysis for Mathematics and Engineering*, Jones and Bartlett, 3rd edition, 1997.
- J. Hu and J. Wang, “Global stability of complex-valued recurrent neural networks with time-delays,” *IEEE Transactions on Neural Networks*, vol. 23, no. 6, pp. 853–865, 2012.
- V. S. H. Rao and G. R. Murthy, “Global dynamics of a class of complex valued neural networks,” *International Journal of Neural Systems*, vol. 18, no. 2, pp. 165–171, 2008.
- W. Zhou and J. M. Zurada, “Discrete-time recurrent neural networks with complex-valued linear threshold neurons,” *IEEE Transactions on Circuits and Systems II: Express Briefs*, vol. 56, no. 8, pp. 669–673, 2009.
- C. Duan and Q. Song, “Boundedness and stability for discrete-time delayed neural network with complex-valued linear threshold neurons,” *Discrete Dynamics in Nature and Society*, vol. 2010, Article ID 368379, 19 pages, 2010.
- B. Zou and Q. Song, “Boundedness and complete stability of complex-valued neural networks with time delay,” *IEEE Transactions on Neural Networks and Learning Systems*, vol. 24, no. 8, pp. 1227–1238, 2013.
- X. Xu, J. Zhang, and J. Shi, “Exponential stability of complex-valued neural networks with mixed delays,” *Neurocomputing*, vol. 128, pp. 483–490, 2014.
- X. Chen and Q. Song, “Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales,” *Neurocomputing*, vol. 121, pp. 254–264, 2013.
- S.-I. Niculescu, *Delay Effects on Stability: A Robust Control Approach*, Springer, London, UK, 2001.
- V. B. Kolmanovskiĭ and V. R. Nosov, *Stability of Functional-Differential Equations*, Academic Press, London, UK, 1986.
- Y. Liu, Z. Wang, J. Liang, and X. Liu, “Synchronization of coupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delays,” *IEEE Transactions on Cybernetics*, vol. 43, no. 1, pp. 102–114, 2013.
- T. Chen and L. Wang, “Power-rate global stability of dynamical systems with unbounded time-varying delays,” *IEEE Transactions on Circuits and Systems II: Express Briefs*, vol. 54, no. 8, pp. 705–709, 2007.
- T. Chen and L. Wang, “Global $\mu$-stability of delayed neural networks with unbounded time-varying delays,” *IEEE Transactions on Neural Networks*, vol. 18, no. 6, pp. 1836–1840, 2007.
- X. Liu and T. Chen, “Robust $\mu$-stability for uncertain stochastic neural networks with unbounded time-varying delays,” *Physica A: Statistical Mechanics and Its Applications*, vol. 387, no. 12, pp. 2952–2962, 2008.
- X. Liu and J. Cao, “On periodic solutions of neural networks via differential inclusions,” *Neural Networks*, vol. 22, no. 4, pp. 329–334, 2009.
- X. Liu and J. Cao, “Robust state estimation for neural networks with discontinuous activations,” *IEEE Transactions on Systems, Man, and Cybernetics B: Cybernetics*, vol. 40, no. 6, pp. 1425–1437, 2010.
- X. Liu, T. Chen, J. Cao, and W. Lu, “Dissipativity and quasi-synchronization for neural networks with discontinuous activations and parameter mismatches,” *Neural Networks*, vol. 24, no. 10, pp. 1013–1021, 2011.
- X. Liu, J. Cao, and W. Yu, “Filippov systems and quasi-synchronization control for switched networks,” *Chaos*, vol. 22, no. 3, Article ID 033110, 2012.
