Abstract

This paper concerns the problem of global exponential stability of neural networks with discrete and distributed delays. A novel criterion for the global exponential stability of such networks is derived by employing Lyapunov stability theory, homeomorphic mapping theory, and matrix theory. The proposed result improves upon previously reported global stability results. Finally, two illustrative numerical examples are given to show the effectiveness of our results.

1. Introduction

The dynamics of neural networks has been widely studied in the past few decades, owing to their practical importance and successful applications in many areas such as image processing, combinatorial optimization, signal processing, pattern recognition, and associative memories [1–4]. In order to design an associative memory by using a neural network, we must choose appropriate network parameters that allow the designed neural network to have multiple equilibrium points for a particular input vector, depending on the initial states of the neurons. Moreover, in order to solve some classes of optimization problems by employing neural networks, the equilibrium point of the designed neural network must be unique and globally stable, independently of the initial conditions. Therefore, the existence, uniqueness, and stability of the equilibrium point are among the most important dynamical properties from the application point of view [1, 4–6].

It is well known that stability is one of the main properties of neural networks and a crucial feature in their design. However, time delays always occur in various neural networks and cause undesirable dynamic behaviors such as oscillation and instability [3, 7–9]. A great deal of effort has been devoted to the stability analysis of neural networks with various types of time delays, such as constant delay, time-varying delay, and distributed delay [1–4]. In [10], a model for neural networks with constant delay was investigated. Neural networks with time-varying delay have been considered in [2–4, 7–9, 11–24]. Recently, there has been a growing interest in the stability analysis of neural networks with discrete and distributed delays [2, 3, 7–9, 11].

Global stability conditions for neural networks with discrete delays were obtained in [1, 4], but distributed delays were not considered. In [2, 3], neural networks with discrete and distributed delays were investigated by employing LMI methods; however, the resulting stability conditions are difficult to check. In this paper, by using Lyapunov stability theory, homeomorphic mapping theory, and matrix theory, a novel delay-dependent sufficient condition for the global exponential stability of neural networks with discrete and distributed delays is obtained. The derivation of global stability in this paper is simple, and the new result both improves the conditions in [3, 7–9, 11] and is easy to verify. Finally, illustrative numerical examples are given to compare the proposed result with the previously reported ones.

2. Preliminaries

The delayed neural networks that we consider are assumed to be governed by the following ordinary differential equation model:
$$\dot{x}_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t-\tau)) + \sum_{j=1}^{n} d_{ij} \int_{t-\sigma}^{t} f_j(x_j(s))\,ds + u_i, \quad i = 1, 2, \ldots, n. \tag{1}$$
Rewrite it in the vector form
$$\dot{x}(t) = -Cx(t) + Af(x(t)) + Bf(x(t-\tau)) + D\int_{t-\sigma}^{t} f(x(s))\,ds + u, \tag{2}$$
where $x(t) = (x_1(t), x_2(t), \ldots, x_n(t))^T$ denotes the neuron state vector; $C = \operatorname{diag}(c_1, c_2, \ldots, c_n)$ is a positive diagonal matrix; $A = (a_{ij})_{n \times n}$ and $B = (b_{ij})_{n \times n}$ are the interconnection weight matrices, and $D = (d_{ij})_{n \times n}$ is the distributively delayed interconnection weight matrix, respectively; $f(x(t)) = (f_1(x_1(t)), f_2(x_2(t)), \ldots, f_n(x_n(t)))^T$ denotes the neuron activations; $\tau$ and $\sigma$ denote the time delays; and $u = (u_1, u_2, \ldots, u_n)^T$ denotes the constant input vector.
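For readers who wish to experiment numerically, the following is a minimal simulation sketch of model (2) in the standard form written above. All parameter values (the matrices $C$, $A$, $B$, $D$, the delays $\tau$, $\sigma$, and the input $u$) are hypothetical placeholders rather than values taken from this paper, and a fixed-step Euler scheme with a history buffer stands in for a proper delay-differential-equation solver.

```python
import numpy as np

# Hypothetical parameters (placeholders, not taken from the paper).
C = np.diag([2.0, 2.0])                  # positive diagonal matrix
A = np.array([[0.1, -0.2], [0.3, 0.1]])  # interconnection weights
B = np.array([[0.2, 0.1], [-0.1, 0.2]])  # discretely delayed weights
D = np.array([[0.1, 0.0], [0.0, 0.1]])   # distributively delayed weights
u = np.array([0.5, -0.5])                # constant input vector
tau, sigma = 1.0, 0.5                    # discrete and distributed delays
f = np.tanh                              # activation, Lipschitz with l_i = 1

h = 1e-3                                 # Euler step size
steps = round(20.0 / h)                  # simulate t in [0, 20]
n_tau, n_sigma = round(tau / h), round(sigma / h)
hist = max(n_tau, n_sigma)               # history buffer length

# Constant initial history: x(s) = x0 on [-max(tau, sigma), 0].
x = np.tile([1.0, -1.0], (hist + steps + 1, 1))

for k in range(hist, hist + steps):
    x_tau = x[k - n_tau]                            # x(t - tau)
    integral = h * f(x[k - n_sigma:k]).sum(axis=0)  # Riemann sum for the
                                                    # distributed-delay term
    dx = -C @ x[k] + A @ f(x[k]) + B @ f(x_tau) + D @ integral + u
    x[k + 1] = x[k] + h * dx

print("state at t = 20:", x[-1])  # settles at the equilibrium when stable
```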

We will assume that each neuron activation function $f_i$ satisfies the following Lipschitz condition:
$$|f_i(x) - f_i(y)| \le l_i |x - y|, \quad \forall x, y \in \mathbb{R}, \; i = 1, 2, \ldots, n,$$
where $l_i > 0$ is a constant; for example, the commonly used activation $f_i(x) = \tanh(x)$ satisfies this condition with $l_i = 1$.

In this paper, the time delays $\tau$ and $\sigma$ are assumed to be nonnegative; that is, $\tau \ge 0$ and $\sigma \ge 0$.

Throughout the paper, we use the following notations. Let $v = (v_1, v_2, \ldots, v_n)^T$ be a vector of dimension $n$ and let $M = (m_{ij})_{n \times n}$ be a real matrix. Then we consider the following norms:
$$\|v\|_1 = \sum_{i=1}^{n} |v_i|, \qquad \|v\|_2 = \Bigl(\sum_{i=1}^{n} v_i^2\Bigr)^{1/2}, \qquad \|v\|_\infty = \max_{1 \le i \le n} |v_i|,$$
$$\|M\|_1 = \max_{1 \le j \le n} \sum_{i=1}^{n} |m_{ij}|, \qquad \|M\|_2 = \bigl(\lambda_{\max}(M^T M)\bigr)^{1/2}, \qquad \|M\|_\infty = \max_{1 \le i \le n} \sum_{j=1}^{n} |m_{ij}|.$$
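As a quick numerical sanity check, these standard vector norms and the corresponding induced matrix norms are available directly in NumPy; the vector and matrix below are hypothetical examples, not parameters from this paper.

```python
import numpy as np

v = np.array([1.0, -2.0, 3.0])           # hypothetical vector
M = np.array([[1.0, -2.0], [3.0, 0.5]])  # hypothetical matrix

print(np.linalg.norm(v, 1), np.linalg.norm(v, 2), np.linalg.norm(v, np.inf))
# For matrices: ord=1 gives the maximum column sum, ord=2 the largest
# singular value, and ord=inf the maximum row sum.
print(np.linalg.norm(M, 1), np.linalg.norm(M, 2), np.linalg.norm(M, np.inf))
```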

Lemma 1 (see [4]). The map $H(x) \in C^0(\mathbb{R}^n)$ is a homeomorphism of $\mathbb{R}^n$ onto itself if $H$ satisfies the following conditions: (i) $H$ is injective; that is, $H(x) \ne H(y)$ for all $x \ne y$; (ii) $H$ is proper; that is, $\|H(x)\| \to \infty$ as $\|x\| \to \infty$.

Lemma 2. For any real numbers $a$ and $b$, the following inequality holds:
$$2ab \le \varepsilon a^2 + \frac{1}{\varepsilon} b^2,$$
where $\varepsilon$ is any positive constant.
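Assuming the standard form of this elementary inequality written above, its proof is one line: it simply restates that a square is nonnegative. For any $\varepsilon > 0$,
$$0 \le \Bigl(\sqrt{\varepsilon}\, a - \frac{b}{\sqrt{\varepsilon}}\Bigr)^2 = \varepsilon a^2 - 2ab + \frac{1}{\varepsilon} b^2 \quad \Longrightarrow \quad 2ab \le \varepsilon a^2 + \frac{1}{\varepsilon} b^2.$$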

Lemma 3 (see [24]). For given constant numbers $p$ and $q$ satisfying $p > q > 0$, suppose that $V(t)$ is a nonnegative continuous function on $[t_0 - \tau, t_0]$ and, for $t \ge t_0$, the following inequality holds:
$$D^+ V(t) \le -p V(t) + q \bar{V}(t),$$
with
$$\bar{V}(t) = \sup_{t - \tau \le s \le t} V(s),$$
where $\tau \ge 0$ is a constant. Then, for $t \ge t_0$, we have
$$V(t) \le \bar{V}(t_0) e^{-\lambda (t - t_0)},$$
where $\lambda$ is the unique positive solution of the following equation:
$$\lambda = p - q e^{\lambda \tau}.$$
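Assuming the Halanay-type form reconstructed above, the existence and uniqueness of the positive solution $\lambda$ follows from a one-line monotonicity argument: the function
$$g(\lambda) = \lambda - p + q e^{\lambda \tau} \quad \text{satisfies} \quad g(0) = q - p < 0, \qquad g'(\lambda) = 1 + q \tau e^{\lambda \tau} > 0,$$
and $g(\lambda) \to +\infty$ as $\lambda \to +\infty$, so $g$ is strictly increasing and has exactly one positive zero.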

Definition 4 (see [24]). System (1) is said to be globally exponentially stable if it has a unique equilibrium point $x^*$ and there exist constants $M \ge 1$ and $\varepsilon > 0$ such that
$$\|x(t) - x^*\| \le M \|\phi - x^*\| e^{-\varepsilon (t - t_0)}, \quad t \ge t_0,$$
where $x(t)$ is a solution of system (1) with initial value $x(s) = \phi(s)$, $s \in [t_0 - \max\{\tau, \sigma\}, t_0]$.

3. Existence, Uniqueness, and Stability of Equilibrium Point

This section deals with the existence, uniqueness, and stability of the equilibrium point for system (2). The main result is stated in the following theorem.

Theorem 5. The neural network (2) has a unique equilibrium point, and this equilibrium point is globally asymptotically stable, if the following condition holds: where , , , , , .

Proof. The proof is completed in two steps. First, we prove the existence and uniqueness of the equilibrium point of system (2); this is equivalent to proving that the mapping (12) below is a homeomorphism on $\mathbb{R}^n$. In the second step, the global asymptotic stability is proved.
Step  1. Consider the following mapping associated with system (2):
To prove the existence and uniqueness of the equilibrium point, it is sufficient to show that is a homeomorphism on . We first prove that is injective on . For , Multiplying both sides of (13) with , From (14), we get the following inequalities: where . Let and . Then, (16) can be written as follows: where . Let and . Then, (18) can be written as follows:
Combining (15), (17), and (19), the following inequality is obtained:
Because (11) holds,
Then, for any $x \ne y$, , which implies that $H(x) \ne H(y)$ for all $x \ne y$. That is, $H$ is injective on $\mathbb{R}^n$.
Letting $y = 0$, we obtain ; it follows that , and, therefore, . Because $\|H(0)\|$ is finite, it is obvious that $\|H(x)\| \to \infty$ as $\|x\| \to \infty$, so $H$ is proper on $\mathbb{R}^n$. By Lemma 1, system (2) has a unique equilibrium point.
Step 2. We show that the condition in Theorem 5 for the existence and uniqueness of the equilibrium point also implies the global asymptotic stability of neural network (2). From the above proof, the system has a unique equilibrium point $x^*$. Using the transformation $z(t) = x(t) - x^*$, system (2) can be shifted to the origin, and the following form is obtained:
$$\dot{z}(t) = -Cz(t) + A g(z(t)) + B g(z(t - \tau)) + D \int_{t - \sigma}^{t} g(z(s))\,ds, \tag{26}$$
where $g(z(\cdot)) = f(z(\cdot) + x^*) - f(x^*)$, $g(0) = 0$, and the following inequality holds:
$$|g_i(z_i)| \le l_i |z_i|, \quad \forall z_i \in \mathbb{R}, \; i = 1, 2, \ldots, n.$$
Construct the following positive definite Lyapunov functional: where , , and are defined as follows:
Taking the derivative of $V(t)$ along the trajectories of system (26), we can get the following inequalities: Let and ; (32) can be written as follows: Let and ; (34) can be written as follows:
Combining (31), (33), and (35), the following inequality is obtained:
From (36), we get the following inequality: By the well-known Lyapunov stability theory, we can conclude that system (2) is globally asymptotically stable.

In fact, if condition (11) holds, system (26) (or, equivalently, (2)) is also globally exponentially stable; that is, the following theorem is true.

Theorem 6. The neural network (26) (or (2)) is globally exponentially stable if the following condition holds: where , , , , , .

Proof. Construct the following positive definite Lyapunov functional:
Taking the derivative of $V(t)$ along the trajectories of system (26), one can get the following inequalities: Let and ; (42) can be written as follows: Let and ; (44) can be written as follows:
Utilizing (41), (43), and (45), the following inequality is obtained:
Let
Then, combining (46), the following inequality holds:
Equation (48) can be written as follows: where , .
Because (38) holds, it is obvious that $p > q > 0$. According to Lemma 3, the following inequality holds:
So the following inequality is obtained: That is, Therefore, we can conclude that system (26) (or (2)) is globally exponentially stable.

Remark 7. The procedure used in this paper to establish global exponential stability is simple, and the novel stability condition is less conservative. Moreover, the new global stability condition can be verified easily. The following examples show its validity.

Remark 8. In order to judge the stability of system (1), the norms $\|A^*\|_2$ and $\|B^*\|_2$ have been calculated in [1, 5, 6], where $A^* = (|a_{ij}|)_{n \times n}$ and $B^* = (|b_{ij}|)_{n \times n}$. In this paper, $\|A\|_2$ and $\|B\|_2$ are calculated. It should be noted that $\|A\|_2$ is smaller than $\|A^*\|_2$, and $\|B\|_2$ is smaller than $\|B^*\|_2$ used in [1, 5, 6]. Hence, it may lead to less conservatism.
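Assuming the norms in this remark compare the matrix 2-norm of a weight matrix with that of its entrywise absolute value, as reconstructed above, the comparison $\|M\|_2 \le \|M^*\|_2$ with $M^* = (|m_{ij}|)_{n \times n}$ holds for every real matrix and is easy to check numerically; the matrix below is a hypothetical example.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4))       # hypothetical matrix with mixed signs
M_star = np.abs(M)                # entrywise absolute value

n2 = np.linalg.norm(M, 2)         # largest singular value of M
n2_star = np.linalg.norm(M_star, 2)
print(f"||M||_2 = {n2:.4f} <= ||M*||_2 = {n2_star:.4f}")
assert n2 <= n2_star + 1e-12      # ||M||_2 <= ||M*||_2 for any real M
```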

Remark 9. Condition (11) guarantees the existence and uniqueness of the equilibrium of system (2), as well as the global exponential stability.

Remark 10. In condition (11), the given scalars and satisfy the optimizing procedures and , respectively.

4. Examples

In this section, two numerical examples will be presented to show the validity of the main results derived in this paper.

Example 1. Consider the system (2) with the following parameters [2]: Take , . Then, we can get , , , , and . Let and : According to Theorem 6, system (1) is globally exponentially stable.

Example 2. Consider the system (1) with the following parameters [2]: Take . Then we can get , , , , and . Let and ; the upper bounds of the delay derived by Theorem 5 in our paper and the results in [2, 3, 7–9, 11] are listed in Table 1. This example shows that the stability condition in this paper gives a much less conservative result than those in the previous literature.

5. Conclusions

In this paper, a novel global exponential stability condition for neural networks with discrete and distributed delays has been proposed. The condition has been derived by choosing appropriate Lyapunov functionals and employing homeomorphic mapping theory. Finally, two numerical examples have been given to illustrate the effectiveness of the proposed method.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.