Abstract

The complex-valued neural networks with unbounded time-varying delays are considered. By constructing appropriate Lyapunov-Krasovskii functionals and employing the free weighting matrix method, several delay-dependent criteria for checking the global μ-stability of the addressed complex-valued neural networks are established in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Two examples with simulations are given to show the effectiveness and less conservatism of the proposed criteria.

1. Introduction

Nonlinear systems are ubiquitous in real life [1–4], and complex networks form one important class of nonlinear systems [5–7]; neural networks, in turn, are among the most important complex networks [8, 9]. In recent years, there has been increasing research interest in analyzing the dynamic behaviors of complex-valued neural networks due to their widespread applications in filtering, imaging, optoelectronics, speech synthesis, computer vision, and so on; see [10–16] and the references therein. In these applications, the stability of complex-valued neural networks plays a very important role.

In real-valued neural networks, the activation function is usually chosen to be smooth and bounded. However, in the complex domain, according to Liouville’s theorem [17], every bounded entire function must be constant. Thus, if the activation function is entire and bounded in the complex domain, then it is a constant, which is not suitable. Therefore, the choice of activation functions is the main challenge for complex-valued neural networks. In [18–20], the authors considered three kinds of activation functions, respectively. In [18], the authors considered a class of continuous-time recurrent neural networks with two types of activation functions and gave several sufficient conditions for the existence, uniqueness, and global stability of the equilibrium point. In [19], a class of generalized discrete-time complex-valued neural networks was studied; the existence of a unique equilibrium pattern was discussed and a sufficient condition for global exponential stability was given. In [20], the authors discussed a class of discrete-time recurrent neural networks with complex-valued linear threshold neurons; the boundedness, global attractivity, and complete stability of such networks were investigated, and some conditions for those properties were derived. In [21], discrete-time delayed neural networks with complex-valued linear threshold neurons were investigated, and several criteria on boundedness and global exponential stability were obtained. In [22–24], the authors considered different kinds of time delays in complex-valued neural networks. In [22], the boundedness and complete stability of complex-valued neural networks with time delay were studied; some conditions guaranteeing the boundedness and complete stability of the considered neural networks were derived by using the local inhibition and energy minimization methods. In [23], the authors investigated the dynamic behaviors of a class of complex-valued neural networks with mixed time delays.
Some sufficient conditions assuring the existence, uniqueness, and exponential stability of the equilibrium point of the system were derived using the vector Lyapunov function method, the homeomorphism mapping lemma, and matrix theory. In [24], complex-valued neural networks with both leakage time delay and discrete time delay as well as two types of activation functions on time scales were considered. Several delay-dependent criteria for checking the global stability of the addressed complex-valued neural networks were established in terms of linear matrix inequalities by constructing appropriate Lyapunov-Krasovskii functionals and employing the free weighting matrix method. However, the obtained results were based on the assumption that the time delay is bounded. As we know, time delays occur and vary frequently and irregularly in many engineering systems; sometimes they depend heavily on the histories and may be unbounded [25–27].

How can the desirable stability be guaranteed if the time delays are unbounded? Recently, the authors of [28, 29] proposed two new concepts, power stability and μ-stability, which can be applied to dynamical systems with unbounded time-varying delays. In [28], the authors considered dynamical systems with unbounded time-varying delays. Two approaches were developed to derive sufficient conditions ensuring the existence and uniqueness of the equilibrium and its global stability. Moreover, under mild conditions, the authors proved that the dynamical systems with unbounded time-varying delays were globally power stable. In [29], the authors discussed the μ-stability of dynamical systems with unbounded time-varying delays and showed that, under mild conditions, the delayed system with very large time delays has a unique equilibrium, which is globally μ-stable. In [30], the authors investigated the global robust stability of uncertain stochastic neural networks with unbounded time-varying delays and norm-bounded parameter uncertainties. A new concept of global robust μ-stability in the mean square for neural networks was given first, and then stability criteria were presented by means of the linear matrix inequality approach. To the best of our knowledge, however, few authors have considered the problem of μ-stability of complex-valued neural networks with unbounded time-varying delays.

Motivated by the above discussions, the objective of this paper is to study the global μ-stability of complex-valued neural networks with unbounded time-varying delays by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. The obtained sufficient conditions do not require the existence of the partial derivatives of the activation functions and are expressed in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Two examples are given to show the effectiveness and less conservatism of the proposed criteria.

In the literature, the usual method of analyzing complex-valued neural networks is to separate them into real and imaginary parts and then to recast them into equivalent real-valued neural networks [18, 24]. However, this separation is not always expressible in an analytical form if the real and imaginary parts of the activation functions cannot be separated. In this paper, we do not separate the complex-valued neural networks and directly consider their properties on ℂⁿ.

Notations. The notations are quite standard. Throughout this paper, let i denote the imaginary unit; that is, i² = −1. ℂⁿ, ℝ^{n×n}, and ℂ^{n×n} denote, respectively, the set of n-dimensional complex vectors, n × n real matrices, and n × n complex matrices. The superscripts T and H denote matrix transposition and matrix complex conjugate transposition, respectively. For a complex vector z ∈ ℂⁿ, let |z| = (|z₁|, |z₂|, …, |zₙ|)ᵀ be the module of the vector and ‖z‖ = (∑_{j=1}^{n} |z_j|²)^{1/2} the norm of the vector. The notation X ≥ 0 (resp., X > 0) means that X is positive semidefinite (resp., positive definite). λ_max(P) and λ_min(P) are defined as the largest and the smallest eigenvalues of a positive definite matrix P, respectively. Sometimes, the arguments of a function or a matrix will be omitted in the analysis when no confusion can arise.

2. Problems Formulation and Preliminaries

In this section, we will recall some definitions and lemmas which will be used in the proofs of our main results.

In this paper, we consider the following complex-valued neural networks with time-varying delays:

ż(t) = −Cz(t) + Af(z(t)) + Bf(z(t − τ(t))) + u,  (1)

for t ≥ 0, where z(t) = (z₁(t), z₂(t), …, zₙ(t))ᵀ ∈ ℂⁿ is the state vector of the neural networks with n neurons at time t. C = diag(c₁, c₂, …, cₙ) is the self-feedback connection weight matrix with cᵢ > 0 (i = 1, 2, …, n). A and B are the connection weight matrix and the delayed connection weight matrix, respectively. f(z(t)) denotes the neuron activation at time t. u is the external input vector.

The initial condition associated with neural network (1) is given by z(s) = φ(s), s ∈ (−∞, 0], where φ is continuous on (−∞, 0].
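To fix ideas, the following is a minimal simulation sketch of a delayed complex-valued network of the standard form ż(t) = −Cz(t) + Af(z(t)) + Bf(z(t − τ)) + u, integrated by the forward Euler method. All parameter values, the activation choice, and the constant initial history below are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Forward-Euler simulation of dz/dt = -C z(t) + A f(z(t)) + B f(z(t - tau)) + u.
# Illustrative two-neuron example; matrices are placeholders, not the paper's.

def f(z):
    # A bounded complex activation: tanh applied to real and imaginary parts.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

C = np.diag([2.0, 2.0])                                  # self-feedback, c_i > 0
A = np.array([[0.2 + 0.1j, 0.1], [0.0, 0.2 - 0.1j]])     # connection weights
B = np.array([[0.1, 0.1j], [0.1j, 0.1]])                 # delayed connection weights
u = np.zeros(2, dtype=complex)                           # external input
tau, dt, T = 1.0, 0.01, 15.0

steps = int(T / dt)
d = int(tau / dt)                                        # delay measured in steps
z = np.zeros((steps + 1, 2), dtype=complex)
z[0] = np.array([1.0 + 1.0j, -1.0 + 0.5j])               # constant history = z[0]

for k in range(steps):
    z_del = z[max(k - d, 0)]                             # delayed state
    z[k + 1] = z[k] + dt * (-C @ z[k] + A @ f(z[k]) + B @ f(z_del) + u)

final_norm = np.linalg.norm(z[-1])
```

With the self-feedback dominating the (Lipschitz) activation gains, the state decays toward the origin, which is the equilibrium here since f(0) = 0 and u = 0.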

As we know, the activation functions play a very important role in the study of global stability of neural networks. However, in the complex domain, the activation functions cannot be both entire and bounded. In this paper, we will consider the following two types of activation functions.

(H1) Let z = x + iy, where x, y ∈ ℝⁿ. f(z) can be expressed by separating it into its real and imaginary parts as f(z) = f^R(x, y) + i f^I(x, y), where f^R and f^I are real-valued. Suppose there exist positive constants l_j^{RR}, l_j^{RI}, l_j^{IR}, and l_j^{II} (j = 1, 2, …, n) such that, for any x₁, x₂, y₁, y₂,

|f_j^R(x₁, y₁) − f_j^R(x₂, y₂)| ≤ l_j^{RR} |x₁ − x₂| + l_j^{RI} |y₁ − y₂|,
|f_j^I(x₁, y₁) − f_j^I(x₂, y₂)| ≤ l_j^{IR} |x₁ − x₂| + l_j^{II} |y₁ − y₂|.

Moreover, we define L^{RR} = diag(l₁^{RR}, …, lₙ^{RR}), and L^{RI}, L^{IR}, and L^{II} analogously.

(H2) The real and imaginary parts of f cannot be separated, but f is bounded and satisfies the following condition: |f_j(z₁) − f_j(z₂)| ≤ l_j |z₁ − z₂| for any z₁, z₂ ∈ ℂ, j = 1, 2, …, n, where l_j > 0 is a constant. Moreover, we define L = diag(l₁, l₂, …, lₙ).
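A concrete activation of the kind allowed by this assumption is f(z) = z/(1 + |z|): its real and imaginary parts each depend jointly on Re z and Im z, yet it is bounded (|f(z)| < 1) and globally Lipschitz with constant 1. This example is illustrative and not one stated in the paper; the sketch below spot-checks the Lipschitz and boundedness conditions numerically.

```python
import numpy as np

# Spot-check the (H2)-type condition |f(z1) - f(z2)| <= l * |z1 - z2|
# for the non-separable bounded activation f(z) = z / (1 + |z|).
# Illustrative example activation, not one taken from the paper.

def f(z):
    return z / (1.0 + abs(z))

rng = np.random.default_rng(0)
l = 1.0                                   # Lipschitz constant for this f
ok = True
for _ in range(20000):
    z1 = complex(*rng.uniform(-50, 50, 2))
    z2 = complex(*rng.uniform(-50, 50, 2))
    if abs(f(z1) - f(z2)) > l * abs(z1 - z2) + 1e-12:
        ok = False

# Boundedness: |f(z)| = |z| / (1 + |z|) < 1 for every z.
bounded = max(abs(f(complex(*rng.uniform(-1e6, 1e6, 2)))) for _ in range(1000)) < 1.0
```

In polar form f maps r·e^{iθ} to (r/(1+r))·e^{iθ}, whose differential has singular values 1/(1+r)² and 1/(1+r), both at most 1, which is why l = 1 suffices.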

For the time-varying delay τ(t), we give the following assumption.

(H3) τ(t) is nonnegative and differentiable and satisfies τ̇(t) ≤ ρ, where ρ is a nonnegative constant.

As usual, a vector ẑ ∈ ℂⁿ is said to be an equilibrium point of neural network (1) if it satisfies −Cẑ + (A + B)f(ẑ) + u = 0.

Throughout the paper, the equilibrium point ẑ of neural network (1) is assumed to exist. For notational convenience, we will always shift an intended equilibrium point ẑ of neural network (1) to the origin by letting z̃(t) = z(t) − ẑ. It is easy to transform neural network (1) into the following form:

z̃̇(t) = −Cz̃(t) + Ag(z̃(t)) + Bg(z̃(t − τ(t)))  (7)

for t ≥ 0, where g(z̃(t)) = f(z̃(t) + ẑ) − f(ẑ) and g(0) = 0.

Next we introduce some definitions and lemmas to be used in the stability analysis.

Definition 1. Suppose that μ(t) is a positive continuous function and satisfies μ(t) → ∞ as t → ∞. If there exist scalars M > 0 and t₀ ≥ 0 such that ‖z(t)‖ ≤ M/μ(t) for t ≥ t₀, then neural network (1) is said to be μ-stable.

Definition 2. If there exist scalars ε > 0, M > 0, and t₀ ≥ 0 such that ‖z(t)‖ ≤ Me^{−εt} for t ≥ t₀, then neural network (1) is said to be exponentially stable.

Definition 3. If there exist scalars ε > 0, M > 0, and t₀ > 0 such that ‖z(t)‖ ≤ Mt^{−ε} for t ≥ t₀, then neural network (1) is said to be power stable.

Remark 4. It is obvious that both e^{εt} and t^{ε} approach +∞ as t approaches +∞. Therefore, neural network (1) is μ-stable, with μ(t) = e^{εt} or μ(t) = t^{ε}, if it is exponentially stable or power stable. In other words, exponential stability and power stability are two special cases of μ-stability.

Lemma 5 (see [22]). Given a Hermitian matrix H, then H < 0 is equivalent to

( H_R  −H_I ; H_I  H_R ) < 0,

where H_R = Re(H) and H_I = Im(H).
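This lemma is the device that turns a complex Hermitian LMI into a real one solvable by standard toolboxes. A quick numerical sanity check of the equivalence, on a randomly generated Hermitian negative definite matrix (an illustrative matrix, not one from the paper):

```python
import numpy as np

# Check: a Hermitian H is negative definite iff the real symmetric block
# matrix [[Re H, -Im H], [Im H, Re H]] is; its eigenvalues are those of H,
# each with multiplicity two.

rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = -(M @ M.conj().T) - 0.1 * np.eye(3)   # Hermitian and negative definite

HR, HI = H.real, H.imag                   # HR symmetric, HI antisymmetric
block = np.block([[HR, -HI], [HI, HR]])   # real 6x6 representation

eig_H = np.linalg.eigvalsh(H)             # real eigenvalues of Hermitian H
eig_block = np.linalg.eigvalsh(block)     # block is real symmetric

neg_H = bool(np.all(eig_H < 0))
neg_block = bool(np.all(eig_block < 0))
```

Since the block matrix's spectrum is that of H doubled, definiteness transfers in both directions, which is exactly what lets the later theorems be checked as real LMIs.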

3. Main Results

In this section, we are going to give several criteria for checking the global μ-stability of the complex-valued neural networks (1).

Theorem 6. Assume that assumptions (H1) and (H3) hold. Neural network (1) is globally μ-stable if there exist two positive definite Hermitian matrices, two real positive diagonal matrices, two complex matrices, a positive differentiable function μ(t) defined on [0, ∞), and three constants such that condition (12) and the following LMIs (13) and (14) hold.

Proof. Consider the following Lyapunov-Krasovskii functional candidate: Calculate the derivative of V(t) along the trajectories of neural network (7). We obtain In deriving inequality (17), we have made use of condition (12) and assumption (H3).
From assumption (H1), we get for all , where , . Hence, where , , , . Let . It follows from (19) that for . Thus
Also, we can get that
From (7), we have that It follows from (17), (21), (22), and (23) that where and with ,  , ,  . From (14), we have that . It follows from (13) and Lemma 5 that . Thus, we get from (24) that which means is monotonically nonincreasing for . So we have From the definition of in (16), we obtain that where . It implies that where . The proof is completed.

Theorem 7. Assume that assumptions (H2) and (H3) hold. Neural network (1) is globally μ-stable if there exist two positive definite Hermitian matrices, two real positive diagonal matrices, two complex matrices, a positive differentiable function μ(t) defined on [0, ∞), and three constants such that conditions (12), (13), and (14) in Theorem 6 are satisfied, with the activation-related matrices redefined according to assumption (H2).

Proof. From assumption (H2), we get for . Let . It follows from (30) that for . Hence Also, we can get that Consider the Lyapunov-Krasovskii functional (16). From (17), (23), (32), and (33), we can get that where and with ,  , ,  . From the proof of Theorem 6, we know that neural network (1) is μ-stable. The proof is completed.

Corollary 8. Assume that assumptions (H1) and (H3) hold and the delay τ(t) is bounded. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices, two real positive diagonal matrices, two complex matrices, and a constant such that conditions (13) and (14) in Theorem 6 are satisfied.

Proof. Let (); then , . Take , , and . It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.

Corollary 9. Assume that assumptions (H1) and (H3) hold and the delay τ(t) grows at most linearly in t. Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices, two real positive diagonal matrices, two complex matrices, and two constants such that conditions (13) and (14) in Theorem 6 are satisfied.

Proof. Let (); then , . Take , . It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.

Similar to the proofs of Corollaries 8 and 9, we have the following results.

Corollary 10. Assume that assumptions (H2) and (H3) hold and the delay τ(t) is bounded. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices, two real positive diagonal matrices, two complex matrices, and a constant such that conditions (13) and (14) in Theorem 7 are satisfied.

Corollary 11. Assume that assumptions (H2) and (H3) hold and the delay τ(t) grows at most linearly in t. Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices, two real positive diagonal matrices, two complex matrices, and two constants such that conditions (13) and (14) in Theorem 7 are satisfied.

Remark 12. In [18, 23], the authors studied the global stability of complex-valued neural networks with time-varying delays, and the activation function was supposed to be separable as f(z) = f^R(x, y) + i f^I(x, y), where z = x + iy and f^R, f^I are real-valued for all x, y, and the partial derivatives of f^R, f^I with respect to x and y were required to exist and be continuous. In this paper, neither the real parts nor the imaginary parts of the activation functions are assumed to be differentiable.

Remark 13. In [21], the authors investigated boundedness and stability for discrete-time complex-valued neural networks with constant delay, where the activation function was supposed to be a complex-valued linear threshold function. Also, the authors gave a complex-valued LMI criterion for checking stability of the considered neural networks, but a feasible way to solve the given complex-valued LMI was not provided. In this paper, the criteria for checking stability of the complex-valued neural networks are established as real LMIs, which can be checked numerically using the effective LMI toolbox in MATLAB.

4. Examples

Example 1. Consider a two-neuron complex-valued neural network with constant delay where
It can be verified that the activation functions satisfy assumption (H1) and that the time-varying delay satisfies assumption (H3). By employing the MATLAB LMI toolbox, we can find feasible solutions to the LMIs in conditions (13) and (14) of Corollary 8. Then all conditions in Corollary 8 are satisfied. Therefore the neural network (38) is globally exponentially stable. Figures 1 and 2 depict the real and imaginary parts of the states of the considered neural network (38) under the given initial condition.

Example 2. Consider a two-neuron complex-valued neural network with unbounded time-varying delays where
It can be verified that the activation functions satisfy assumption (H2) and that the time-varying delay satisfies assumption (H3). By employing the MATLAB LMI toolbox, we can find feasible solutions to the LMIs in conditions (13) and (14) of Corollary 11. Then all conditions in Corollary 11 are satisfied. Thus the neural network (41) is globally power stable. Figures 3 and 4 depict the real and imaginary parts of the states of the considered network (41) under the given initial condition.
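To illustrate power-type (rather than exponential) decay under an unbounded delay, the following sketch simulates a scalar complex-valued network with the proportional delay τ(t) = 0.5t, which is unbounded yet satisfies (H3) with τ̇(t) = 0.5. The parameter values and activation are illustrative assumptions, not Example 2's actual data.

```python
import numpy as np

# Euler simulation of dz/dt = -c z(t) + a f(z(t)) + b f(z(t - tau(t)))
# with the unbounded proportional delay tau(t) = 0.5 t, i.e. t - tau(t) = 0.5 t.
# Illustrative scalar example with placeholder parameters.

def f(z):
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

c, a, b = 2.0, 0.3 + 0.1j, 0.2
dt, T = 0.01, 40.0
steps = int(T / dt)

z = np.zeros(steps + 1, dtype=complex)
z[0] = 1.0 + 1.0j

for k in range(steps):
    t = k * dt
    k_del = int(0.5 * t / dt)          # grid index of the delayed time 0.5 t
    z[k + 1] = z[k] + dt * (-c * z[k] + a * f(z[k]) + b * f(z[k_del]))

final_abs = abs(z[-1])                  # the state still decays to the origin
```

Because the delayed term always references the state at half the current time, the decay is slower than exponential, but with the self-feedback dominating the coupling gains the trajectory still converges to zero, consistent with power stability.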

5. Conclusion

In this paper, the global μ-stability of complex-valued neural networks with unbounded time-varying delays has been investigated by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. Several sufficient conditions for checking the global μ-stability of the considered complex-valued neural networks have been given in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. As direct applications, we obtain exponential stability and power stability criteria with respect to different time-varying delays. Two examples have been provided to demonstrate the effectiveness and less conservatism of the proposed criteria.

Remark 14. Note that the activation functions in this paper are continuous. Recently, neural networks with discontinuous activations have received more and more attention [31–34]. Based on the results of this paper, we will consider the μ-stability of complex-valued neural networks with discontinuous activations in the future.

Conflict of Interests

The authors declare that they have no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants 61273021 and 11172247 and in part by the Natural Science Foundation Project of CQ cstc2013jjB40008.