Periodic Solutions and Exponential Stability of a Class of Neural Networks with Time-Varying Delays
Employing coincidence degree theory, we further investigate a class of neural networks with delays. A family of sufficient conditions is given for checking global exponential stability. These results are of practical significance in the design and application of globally stable neural networks with delays. Our results extend and improve some earlier publications.
The stability of dynamical neural networks with time delay, which have been used in many applications such as optimization, control, and image processing, has received much attention recently (see, e.g., [1–15]). In particular, the authors of [3, 8, 9, 14, 16] have studied the stability of neural networks with time-varying delays.
As pointed out in , global dissipativity is also an important concept in dynamical neural networks. The concept of global dissipativity in dynamical systems is more general, and it has found applications in areas such as stability theory, chaos and synchronization theory, system norm estimation, and robust control . The global dissipativity of several classes of neural networks was discussed, and some sufficient conditions for the global dissipativity of neural networks with constant delays were derived, in .
In this paper, without assuming the boundedness, monotonicity, or differentiability of the activation functions, we consider the following delay differential equations: where denotes the number of neurons in the network, is the state of the th neuron at time , denote the activation functions of the th neuron at time , and the kernels are piecewise continuous functions with for . Moreover, we consider model (1.1) with , , , , , and satisfying the following assumptions: (A1) the time delays are periodic functions with a common period for ; (A2) , are periodic functions with a common period and
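As a purely illustrative sketch of this model class (not the authors' exact system: the coefficients, the single discrete delay, and the $\tanh$ activation below are all assumptions, and the paper's model may additionally involve distributed-delay kernels), a delayed additive network of the common form $\dot{x}_i(t) = -c_i(t)x_i(t) + \sum_j a_{ij}(t) f_j(x_j(t-\tau(t))) + I_i(t)$ can be integrated with a simple Euler scheme and a history buffer:

```python
import math

def simulate(n_steps=20000, dt=0.001, omega=2 * math.pi):
    """Euler integration of a hypothetical 2-neuron delayed network:
    x_i'(t) = -c_i(t) x_i(t) + sum_j a_ij f(x_j(t - tau(t))) + I_i(t).
    All coefficients below are illustrative, not taken from the paper."""
    f = math.tanh                    # activation; Lipschitz with constant 1
    tau_max = 0.5                    # upper bound on the time-varying delay
    buf_len = int(tau_max / dt) + 1  # length of the history buffer
    # constant initial history phi(s) = (0.5, -0.5) on [-tau_max, 0]
    hist = [[0.5, -0.5] for _ in range(buf_len)]
    for k in range(n_steps):
        t = k * dt
        tau = 0.25 + 0.2 * math.sin(omega * t)   # periodic delay tau(t)
        d = min(int(tau / dt) + 1, buf_len)
        xd = hist[-d]                            # delayed state x(t - tau(t))
        x = hist[-1]                             # current state x(t)
        c = [3.0 + math.cos(omega * t), 3.0 + math.sin(omega * t)]
        a = [[0.2, -0.3], [0.1, 0.2]]
        I = [math.sin(omega * t), math.cos(omega * t)]
        new = [
            x[i] + dt * (-c[i] * x[i]
                         + sum(a[i][j] * f(xd[j]) for j in range(2))
                         + I[i])
            for i in range(2)
        ]
        hist.append(new)
        hist.pop(0)  # keep only the window of history the delay can reach
    return hist[-1]

x_final = simulate()
```

Because the decay rates $c_i(t)$ dominate the connection weights here, trajectories of this sketch remain bounded, in line with assumption (A3)-type dominance conditions.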
The organization of this paper is as follows. In Section 2, the problem formulation and preliminaries are given. In Section 3, some new results are given to ascertain the existence of periodic solutions and the global exponential stability of the neural networks with time-varying delays. Section 4 gives an example to illustrate the effectiveness of our results.
2. Preliminaries and Lemmas
For convenience, two further standing assumptions are formulated as follows. (A3) for all , where , are nonnegative constants. (A4) There exist nonnegative constants , such that for any
The initial conditions associated with system (1.1) are of the form in which is continuous for
For continuous functions defined on , we set If is an equilibrium of system (1.1), then we denote
Definition 2.1. The equilibrium $x^*$ is said to be globally exponentially stable if there exist constants $\varepsilon > 0$ and $M \geq 1$ such that for any solution $x(t)$ of (1.1) we have $\|x(t) - x^*\| \leq M \|\phi - x^*\| e^{-\varepsilon t}$ for $t \geq 0$, where $\varepsilon$ is called the global exponential convergence rate.
Lemma 2.2 (). If for matrix then where denotes the identity matrix of size
3. Periodic Solutions and Exponential Stability
We will use coincidence degree theory to obtain the existence of an $\omega$-periodic solution to system (1.1). For convenience, we briefly summarize the theory as follows.
Let $X$ and $Y$ be normed spaces, let $L : \operatorname{Dom} L \subset X \to Y$ be a linear mapping, and let $N : X \to Y$ be a continuous mapping. The mapping $L$ will be called a Fredholm mapping of index zero if $\dim \operatorname{Ker} L = \operatorname{codim} \operatorname{Im} L < +\infty$ and $\operatorname{Im} L$ is closed in $Y$. If $L$ is a Fredholm mapping of index zero, then there exist continuous projectors $P : X \to X$ and $Q : Y \to Y$ such that $\operatorname{Im} P = \operatorname{Ker} L$ and $\operatorname{Im} L = \operatorname{Ker} Q = \operatorname{Im}(I - Q)$. It follows that $L|_{\operatorname{Dom} L \cap \operatorname{Ker} P} : \operatorname{Dom} L \cap \operatorname{Ker} P \to \operatorname{Im} L$ is invertible. We denote the inverse of this map by $K_P$. If $\Omega$ is a bounded open subset of $X$, the mapping $N$ is called $L$-compact on $\overline{\Omega}$ if $QN(\overline{\Omega})$ is bounded and $K_P(I - Q)N : \overline{\Omega} \to X$ is compact. Because $\operatorname{Im} Q$ is isomorphic to $\operatorname{Ker} L$, there exists an isomorphism $J : \operatorname{Im} Q \to \operatorname{Ker} L$.
Let $\Omega \subset \mathbb{R}^n$ be open and bounded, $f \in C^1(\Omega, \mathbb{R}^n) \cap C(\overline{\Omega}, \mathbb{R}^n)$, and $y \in \mathbb{R}^n \setminus f(\partial \Omega \cup S_f)$; that is, $y$ is a regular value of $f$. Here, $S_f = \{x \in \Omega : \det f'(x) = 0\}$ is the critical set of $f$, and $f'$ is the Jacobian of $f$ at $x$. Then the degree $\deg\{f, \Omega, y\}$ is defined by $\deg\{f, \Omega, y\} = \sum_{x \in f^{-1}(y)} \operatorname{sgn} \det f'(x)$, with the agreement that the above sum is zero if $f^{-1}(y) = \emptyset$. For more details about degree theory, we refer to the book of Deimling.
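As a minimal illustration of this definition (an example supplied here, not taken from the paper), consider the identity map on the unit ball:

```latex
% Example: Brouwer degree of the identity map.
% Let f(x) = x on \Omega = \{ x \in \mathbb{R}^n : \|x\| < 1 \} and y = 0.
% Then f^{-1}(0) = \{0\}, f'(0) = I, and \det f'(0) = 1, so
\deg\{f, \Omega, 0\}
  \;=\; \sum_{x \in f^{-1}(0)} \operatorname{sgn} \det f'(x)
  \;=\; \operatorname{sgn}(1)
  \;=\; 1 .
```

A nonzero degree of this kind is exactly what condition (b) of the continuation theorem below requires on $\Omega \cap \operatorname{Ker} L$.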
Lemma 3.1 (continuation theorem [19, page 40]). Let $L$ be a Fredholm mapping of index zero, and let $N$ be $L$-compact on $\overline{\Omega}$. Suppose that (a) for each $\lambda \in (0,1)$, every solution $x$ of $Lx = \lambda Nx$ is such that $x \notin \partial\Omega$; (b) $QNx \neq 0$ for each $x \in \partial\Omega \cap \operatorname{Ker} L$, and $\deg\{JQN, \Omega \cap \operatorname{Ker} L, 0\} \neq 0$. Then the equation $Lx = Nx$ has at least one solution lying in $\operatorname{Dom} L \cap \overline{\Omega}$.
For the simplicity of presentation, in the remaining part of this paper, for a continuous function , we denote
Theorem 3.2. Let (A1)–(A3) hold, and . If , then system (1.1) has at least a -periodic solution.
Proof. Take , and denote
Equipped with the norms , both and are Banach spaces. Denote
then, for any because of the periodicity, it is easy to check that
Here, for any we identify it as the constant function in or with the value vector Then system (1.1) can be reduced to the operator equation It is easy to see that
and , are continuous projectors such that
It follows that is a Fredholm mapping of index zero. Furthermore, the generalized inverse (to ) is given by
Clearly, and are continuous. For any bounded open subset , is obviously bounded. Moreover, applying the Arzelà-Ascoli theorem, one can easily show that is compact. Therefore, is -compact on with any bounded open subset . Since , we take the isomorphism of onto to be the identity mapping.
We are now in a position to search for an appropriate open bounded set for the application of the continuation theorem corresponding to the operator equation , and we have Assume that is a solution of system (1.1) for some Integrating both sides of (3.13) over the interval we obtain Then Noting that we get It follows that Note that each is continuously differentiable for so there exists such that Set In view of and Lemma 2.2, we have where is given by Let Then, for , we have where denotes the right derivative. Clearly, , are independent of . Then there are no and such that When , is a constant vector in with , Note that ; when it must be We claim that Suppose, on the contrary, that there exists some such that that is, Then, we have which is a contradiction. Therefore, Consider the homotopy defined by Note that ; if then, as before, we have Hence It follows from the homotopy invariance of the degree that Thus, we have shown that satisfies all the assumptions of Lemma 3.1. Hence, has at least one -periodic solution on . This completes the proof.
When , (1.1) turns into the following system:
Theorem 3.4. Let (A1), (A2), and (A4) hold, , and . If , and that then system (1.1) has exactly one -periodic solution. Moreover, it is globally exponentially stable.
Proof. Let with the sup-norm , As usual, if and then for we define by , . From (A4), we can get , Hence, all the hypotheses of Theorem 3.2 hold with , Thus, system (1.1) has at least one -periodic solution, say Let be an arbitrary solution of system (1.1). For , a direct calculation of the right derivative of along the solutions of system (1.1) leads to Let . Then (3.33) can be transformed into Thus, for we have It follows that Thus, for any and we have Therefore, It follows that By Gronwall's inequality, we obtain Without loss of generality, we let For , denotes the largest integer less than or equal to . Noting , and , we get where and are positive constants. From (3.41), it is obvious that the periodic solution is globally exponentially stable, and this completes the proof of Theorem 3.4.
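The Gronwall step in the proof above uses the classical integral form of the inequality; for completeness, a standard statement (with generic symbols, not necessarily those of the paper's display) is:

```latex
% Gronwall's inequality (integral form). If u, \beta \ge 0 are continuous
% on [t_0, \infty) and \alpha \ge 0 is a constant, then
u(t) \;\le\; \alpha + \int_{t_0}^{t} \beta(s)\, u(s)\, \mathrm{d}s,
\qquad t \ge t_0,
% implies the exponential bound
u(t) \;\le\; \alpha \exp\!\Big( \int_{t_0}^{t} \beta(s)\, \mathrm{d}s \Big).
```

Applied to the difference between an arbitrary solution and the periodic solution, a bound of this shape is what yields the exponential decay estimate (3.41).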
Remark 3.6. To the best of our knowledge, few authors have considered the existence of periodic solutions and the global exponential stability of model (1.1) with coefficients and delays all varying periodically in time. We have found only model () in [20, 21]; however, it is assumed in  that are constants, and in  that , , are continuous -periodic functions and , are positive constants. In particular, the authors of  suppose that are continuously differentiable -periodic functions and , which clearly implies that are also constants. Obviously, our model is more general. Furthermore, in [20, 21], , are assumed to be strictly monotone, and the explicit presence of the maximum values of the coefficient functions in Theorems 3.2 and 3.4 of [20, 21] may impose a very strict constraint on the model (e.g., when some of these maximum values are very large). Therefore, our results are more convenient for designing a cellular neural network.
4. An Example
In this section, an example is used to demonstrate that the method presented in this paper is effective.
Example 4.1. Consider the following two-state neural network: where all , , , , are -periodic continuous functions. The activation function , . , , ; ; ; ; ; , . Clearly, satisfies the hypothesis with , . By some simple calculations, we have Therefore, by Theorem 3.4, the above system has an exponentially stable -periodic solution.
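The exponential stability guaranteed by Theorem 3.4 can also be observed numerically: any two solutions must converge to the same periodic orbit at an exponential rate. The following sketch (with hypothetical coefficients, chosen to satisfy a dominance condition of the (A3)/(A4) type, and not the data of Example 4.1) integrates two solutions of a two-neuron delayed network from different initial histories and records their gap:

```python
import math

def run(x0, n_steps=40000, dt=0.0005, omega=2 * math.pi):
    """Euler-integrate a hypothetical 2-neuron delayed network
    from the constant initial history x0; returns the trajectory."""
    f = math.tanh
    tau_max = 0.3
    buf_len = int(tau_max / dt) + 1   # history window for the delay
    hist = [list(x0) for _ in range(buf_len)]
    traj = []
    for k in range(n_steps):
        t = k * dt
        tau = 0.2 + 0.1 * math.sin(omega * t)    # periodic delay
        d = min(int(tau / dt) + 1, buf_len)
        xd, x = hist[-d], hist[-1]
        c = [4.0, 4.0]                            # decay rates
        b = [[0.3, -0.2], [0.1, 0.3]]             # delayed weights
        I = [2 * math.sin(omega * t), 2 * math.cos(omega * t)]
        new = [x[i] + dt * (-c[i] * x[i]
                            + sum(b[i][j] * f(xd[j]) for j in range(2))
                            + I[i]) for i in range(2)]
        hist.append(new)
        hist.pop(0)
        traj.append(new)
    return traj

ta = run([1.0, -1.0])
tb = run([-2.0, 2.0])
# componentwise gap between the two solutions at each step
gap = [max(abs(a[0] - b[0]), abs(a[1] - b[1])) for a, b in zip(ta, tb)]
```

Here the decay rate $c_i = 4$ dominates the Lipschitz bound of the delayed coupling ($\sum_j |b_{ij}| \le 0.5$), so the gap contracts at a rate of roughly $c - \sum_j |b_{ij}|$, mirroring the convergence-rate estimate in Theorem 3.4.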
The first author was partially supported financially by the National Natural Science Foundation of China (10801088).
J. P. LaSalle, The Stability of Dynamical Systems, SIAM, Philadelphia, Pa, USA, 1976.
K. Deimling, Nonlinear Functional Analysis, Springer, Berlin, Germany, 1985.
R. E. Gaines and J. L. Mawhin, Coincidence Degree, and Nonlinear Differential Equations, vol. 568 of Lecture Notes in Mathematics, Springer, Berlin, Germany, 1977.