Abstract

A discrete-time multidirectional associative memory neural network model with varying time delays is formulated by employing the semidiscretization method. A sufficient condition for the existence of an equilibrium point is given. By calculating differences and using inequality techniques, a sufficient condition for the global exponential stability of the equilibrium point is obtained. The results are helpful for designing globally exponentially stable multidirectional associative memory neural networks. An example is given to illustrate the effectiveness of the results.

1. Introduction

The multidirectional associative memory (MAM) neural networks were first proposed by the Japanese scholar M. Hagiwara in 1990 [1]. MAM neural networks have found wide application in speech recognition, image denoising, pattern recognition, and other more complex intelligent information processing tasks, so they have attracted the attention of many researchers [2–7]. In [5], we proposed the following mathematical model of a multidirectional associative memory neural network with varying time delays, which consists of $m$ fields with $n_k$ neurons in the field $k$ ($k=1,2,\ldots,m$):
$$\frac{\mathrm{d}x_{ik}(t)}{\mathrm{d}t}=-a_{ik}x_{ik}(t)+\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}w_{ikjp}f_{jp}\bigl(x_{jp}(t-\tau_{ikjp}(t))\bigr)+I_{ik},\quad i=1,2,\ldots,n_{k},\ k=1,2,\ldots,m,\tag{1.1}$$
where $x_{ik}(t)$ denotes the membrane voltage of the $i$th neuron in the field $k$ at time $t$, $a_{ik}>0$ denotes the decay rate of the $i$th neuron in the field $k$, $f_{jp}$ is the neuronal activation function of the $j$th neuron in the field $p$, $w_{ikjp}$ is the connection weight from the $j$th neuron in the field $p$ to the $i$th neuron in the field $k$, $I_{ik}$ is the external input of the $i$th neuron in the field $k$, and $\tau_{ikjp}(t)$ is the time delay of the synapse from the $j$th neuron in the field $p$ to the $i$th neuron in the field $k$ at time $t$. We studied the existence of an equilibrium point by using the Brouwer fixed-point theorem and obtained a sufficient condition for the global exponential stability of an equilibrium point by constructing a suitable Lyapunov function.

However, discrete-time neural networks are more important than their continuous-time counterparts in applications of neural networks. One can refer to [8–10] for the research significance of discrete-time neural networks. To the best of our knowledge, few studies have considered the stability of discrete-time MAM neural networks.

In this paper, we first formulate a discrete-time analogue of the continuous-time network (1.1), and then study the existence and the global exponential stability of an equilibrium point of the discrete-time MAM neural network.

2. Discrete-Time MAM Neural Network Model and Some Notations

In this section we formulate a discrete-time MAM neural network model with time-varying delays by employing the semidiscretization technique [8].

Let $\mathbb{Z}$ denote the set of integers, $\mathbb{Z}^{+}=\{0,1,2,\ldots\}$, $\mathbb{N}(a)=\{a,a+1,a+2,\ldots\}$, and $\mathbb{N}(a,b)=\{a,a+1,\ldots,b\}$, where $a,b\in\mathbb{Z}$ and $a\le b$.

Let $h$ be a fixed positive real number denoting a uniform discretization step size, and let $[t/h]$ denote the integer part of the real number $t/h$. If $t\in[nh,(n+1)h)$ for some $n\in\mathbb{Z}^{+}$, then $[t/h]=n$. By replacing the time $t$ in the delayed terms of the network (1.1) with $h[t/h]$, we can formulate the following approximation of the network (1.1):
$$\frac{\mathrm{d}x_{ik}(t)}{\mathrm{d}t}=-a_{ik}x_{ik}(t)+\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}w_{ikjp}f_{jp}\Bigl(x_{jp}\bigl(h\bigl([t/h]-\tau_{ikjp}([t/h])\bigr)\bigr)\Bigr)+I_{ik}\tag{2.1}$$
for $t\in[nh,(n+1)h)$. Denoting $x_{ik}(n)=x_{ik}(nh)$, we rewrite (2.1) as follows:
$$\frac{\mathrm{d}x_{ik}(t)}{\mathrm{d}t}=-a_{ik}x_{ik}(t)+\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}w_{ikjp}f_{jp}\bigl(x_{jp}(n-\tau_{ikjp}(n))\bigr)+I_{ik},\quad t\in[nh,(n+1)h).\tag{2.2}$$
Multiplying both sides of (2.2) by $e^{a_{ik}t}$, we obtain
$$\frac{\mathrm{d}}{\mathrm{d}t}\bigl(x_{ik}(t)e^{a_{ik}t}\bigr)=e^{a_{ik}t}\Bigl[\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}w_{ikjp}f_{jp}\bigl(x_{jp}(n-\tau_{ikjp}(n))\bigr)+I_{ik}\Bigr].\tag{2.3}$$
Integrating (2.3) over $[nh,t)$ with $t<(n+1)h$, we have
$$x_{ik}(t)e^{a_{ik}t}-x_{ik}(nh)e^{a_{ik}nh}=\frac{e^{a_{ik}t}-e^{a_{ik}nh}}{a_{ik}}\Bigl[\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}w_{ikjp}f_{jp}\bigl(x_{jp}(n-\tau_{ikjp}(n))\bigr)+I_{ik}\Bigr].\tag{2.4}$$
Letting $t\to(n+1)h$ in (2.4), we obtain
$$x_{ik}(n+1)=x_{ik}(n)e^{-a_{ik}h}+\frac{1-e^{-a_{ik}h}}{a_{ik}}\Bigl[\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}w_{ikjp}f_{jp}\bigl(x_{jp}(n-\tau_{ikjp}(n))\bigr)+I_{ik}\Bigr].\tag{2.5}$$
If we adopt the notations
$$\alpha_{ik}=e^{-a_{ik}h},\qquad \theta_{ik}(h)=\frac{1-e^{-a_{ik}h}}{a_{ik}},\tag{2.6}$$
then we obtain the discrete-time analogue of the continuous-time network (1.1) as follows:
$$x_{ik}(n+1)=\alpha_{ik}x_{ik}(n)+\theta_{ik}(h)\Bigl[\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}w_{ikjp}f_{jp}\bigl(x_{jp}(n-\tau_{ikjp}(n))\bigr)+I_{ik}\Bigr]\tag{2.7}$$
for $n\in\mathbb{Z}^{+}$, $i\in\mathbb{N}(1,n_{k})$, $k\in\mathbb{N}(1,m)$. Obviously, $0<\alpha_{ik}<1$ and $\theta_{ik}(h)>0$.
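To make the update (2.7) concrete, the following minimal sketch implements one step of the discrete-time analogue in Python, with all fields flattened into a single state vector. The function name, the data layout, and the parameters are our own illustrative assumptions, not part of the model specification.

```python
import numpy as np

# Minimal sketch of one step of the discrete-time update (2.7) for a MAM
# network, assuming all fields are flattened into a single state vector.
# Layout, names, and parameters are illustrative placeholders.

def mam_step(history, a, W, I, f, tau, h):
    """Advance the network one step.

    history : list of past state vectors; history[-1] is x(n) and
              history[-1 - d] is x(n - d); must cover the maximum delay
    a       : decay rates a_ik, shape (N,)
    W       : weight matrix with zero diagonal blocks, W[ik, jp] = w_ikjp
    I       : external inputs, shape (N,)
    f       : activation function applied componentwise
    tau     : integer delay matrix at time n, tau[ik, jp] >= 0
    h       : discretization step size
    """
    alpha = np.exp(-a * h)            # alpha_ik = e^{-a_ik h}
    theta = (1.0 - alpha) / a         # theta_ik(h) = (1 - e^{-a_ik h}) / a_ik
    x_n = history[-1]
    N = len(x_n)
    # delayed synaptic input: each connection (ik <- jp) reads its own delayed state
    s = np.array([
        sum(W[i, j] * f(history[-1 - tau[i, j]][j])
            for j in range(N) if W[i, j] != 0.0)
        for i in range(N)
    ])
    return alpha * x_n + theta * (s + I)
```

Here history must hold at least $\tau+1$ past states so that every delayed term $x_{jp}(n-\tau_{ikjp}(n))$ is available.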

Throughout this paper, for any $i\in\mathbb{N}(1,n_{k})$, $j\in\mathbb{N}(1,n_{p})$, $k,p\in\mathbb{N}(1,m)$, $p\neq k$, we assume that the neuronal activation functions and the time delay sequences satisfy the following conditions, respectively:
(H1) There exist constants $L_{jp}>0$ such that for each $u,v\in\mathbb{R}$, $|f_{jp}(u)-f_{jp}(v)|\le L_{jp}|u-v|$;
(H2) the time delays $\tau_{ikjp}(n)$ are nonnegative integers and bounded, that is, there exists $\tau\in\mathbb{Z}^{+}$ such that $\tau_{ikjp}(n)\le\tau$ for all $n\in\mathbb{Z}^{+}$.

The initial conditions associated with (2.7) are of the form
$$x_{ik}(s)=\phi_{ik}(s),\quad s\in\mathbb{N}(-\tau,0),\ i\in\mathbb{N}(1,n_{k}),\ k\in\mathbb{N}(1,m),\tag{2.8}$$
where the $\phi_{ik}(s)$ are given real numbers.

For convenience, set $N=\sum_{k=1}^{m}n_{k}$, $A=\operatorname{diag}(a_{11},\ldots,a_{n_{1}1},\ldots,a_{1m},\ldots,a_{n_{m}m})$, and $L=\operatorname{diag}(L_{11},\ldots,L_{n_{1}1},\ldots,L_{1m},\ldots,L_{n_{m}m})$. For any matrices $P=(p_{ij})_{s\times t}$ and $Q=(q_{ij})_{s\times t}$, we use the notation $P\le Q$ ($P<Q$) to mean that $p_{ij}\le q_{ij}$ ($p_{ij}<q_{ij}$) for all $i\in\mathbb{N}(1,s)$, $j\in\mathbb{N}(1,t)$. Let the matrix $W=(W_{kp})_{m\times m}$ be the block matrix with blocks $W_{kp}=(|w_{ikjp}|)_{n_{k}\times n_{p}}$ for $p\neq k$, where the diagonal blocks $W_{kk}$ are zero matrices.
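The following sketch illustrates, under hypothetical field sizes and parameters, how the block matrix $W$ and the matrix $A-WL$ used in the theorems below can be assembled and tested numerically. The M-matrix test relies on the standard characterization that a Z-matrix is a nonsingular M-matrix if and only if all of its eigenvalues have positive real parts.

```python
import numpy as np

# Illustrative sketch: assemble the block matrix W (zero diagonal blocks)
# and test whether A - W L is a nonsingular M-matrix. Field sizes, weights,
# and Lipschitz constants are hypothetical placeholders.

def is_nonsingular_m_matrix(M, tol=1e-12):
    """A Z-matrix (nonpositive off-diagonal entries) is a nonsingular
    M-matrix iff all of its eigenvalues have positive real parts."""
    off_diag = M - np.diag(np.diag(M))
    if np.any(off_diag > tol):
        return False                      # not a Z-matrix
    return bool(np.all(np.linalg.eigvals(M).real > tol))

n = [2, 1, 1]                             # hypothetical neurons per field
N = sum(n)
a = np.array([1.0, 1.2, 0.9, 1.1])        # hypothetical decay rates a_ik
L = np.array([0.5, 0.5, 0.5, 0.5])        # hypothetical Lipschitz constants
W = np.abs(np.random.randn(N, N)) * 0.2   # hypothetical |w_ikjp|
# zero out the diagonal blocks (no connections within a field)
offsets = np.cumsum([0] + n)
for k in range(len(n)):
    W[offsets[k]:offsets[k+1], offsets[k]:offsets[k+1]] = 0.0

M = np.diag(a) - W @ np.diag(L)           # A - W L
print(is_nonsingular_m_matrix(M))         # whether the condition holds here
```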

3. The Existence and Global Exponential Stability of an Equilibrium Point

In this section, we will give two theorems about the existence and the global exponential stability of an equilibrium point of the discrete-time MAM neural network (2.7).

Lemma 3.1. If $a>0$ and $h>0$, then $ahe^{-ah}<1-e^{-ah}<ah$.

Proof. Define a function $\varphi(u)=u-1+e^{-u}$ ($u\ge 0$). From $\varphi'(u)=1-e^{-u}>0$ for $u>0$, we know that the function $\varphi$ is increasing on the interval $(0,\infty)$. Therefore, $\varphi(u)>\varphi(0)=0$ for $u>0$. So we have $1-e^{-ah}<ah$.
Define a function $\psi(u)=1-e^{-u}-ue^{-u}$ for $u\ge 0$ again. Obviously $\psi(0)=0$, $\psi'(u)=ue^{-u}$. Because there exists a unique critical number $u=0$, we know that $\psi'(u)>0$ for $u>0$. It shows that $\psi$ is increasing on $(0,\infty)$. So we have $\psi(u)>\psi(0)=0$ for $u>0$. In view of $u=ah>0$, we obtain $ahe^{-ah}<1-e^{-ah}$ for $a>0$ and $h>0$.
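As a quick numerical illustration of Lemma 3.1 (not part of the proof), the two inequalities can be spot-checked on a grid of values of $a$ and $h$:

```python
import numpy as np

# Numerical spot-check of Lemma 3.1 on a small grid of (a, h) values;
# purely illustrative, not part of the proof.
for a in [0.1, 1.0, 5.0]:
    for h in [0.01, 0.5, 2.0]:
        u = a * h
        assert u * np.exp(-u) < 1.0 - np.exp(-u) < u
```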

By a method similar to that in [5], we can prove the following Theorem 3.2.

Theorem 3.2. Suppose that all the neuronal activation functions $f_{jp}$ ($j\in\mathbb{N}(1,n_{p})$, $p\in\mathbb{N}(1,m)$) are continuous and the condition (H1) holds. If $A-WL$ is a nonsingular M-matrix, then there exists an equilibrium point of the discrete-time MAM neural network (2.7).

Proof. Let $\beta_{jp}=|f_{jp}(0)|$. Obviously, from (H1), $|f_{jp}(u)|\le L_{jp}|u|+\beta_{jp}$ for any $u\in\mathbb{R}$. Denote $\beta=(\beta_{11},\ldots,\beta_{n_{1}1},\ldots,\beta_{1m},\ldots,\beta_{n_{m}m})^{T}$, $I=(I_{11},\ldots,I_{n_{1}1},\ldots,I_{1m},\ldots,I_{n_{m}m})^{T}$. Because $W\beta+|I|\ge 0$ and $A-WL$ is a nonsingular M-matrix, by Lemma A3 in [11], we have $(A-WL)^{-1}\ge 0$. That is, $(A-WL)^{-1}z\ge 0$ for any $z\ge 0$, $z\in\mathbb{R}^{N}$. From the definitions of $A$, $W$, and $L$, we have $d=(A-WL)^{-1}(W\beta+|I|)\ge 0$. Therefore,
$$a_{ik}d_{ik}=\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}|w_{ikjp}|\bigl(L_{jp}d_{jp}+\beta_{jp}\bigr)+|I_{ik}|.\tag{3.1}$$
Let $\Omega=\{x\in\mathbb{R}^{N}:|x_{ik}|\le d_{ik},\ i\in\mathbb{N}(1,n_{k}),\ k\in\mathbb{N}(1,m)\}$ with the norm $\|x\|=\max_{k\in\mathbb{N}(1,m)}\max_{i\in\mathbb{N}(1,n_{k})}|x_{ik}|$. Obviously, $\Omega$ is a bounded, closed, and convex compact subset of $\mathbb{R}^{N}$. Define a function $T:\Omega\to\mathbb{R}^{N}$ as $T(x)=(T_{11}(x),\ldots,T_{n_{m}m}(x))^{T}$, where
$$T_{ik}(x)=\frac{1}{a_{ik}}\Bigl[\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}w_{ikjp}f_{jp}(x_{jp})+I_{ik}\Bigr].\tag{3.2}$$
From the condition (H1) and (3.1), we have
$$|T_{ik}(x)|\le\frac{1}{a_{ik}}\Bigl[\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}|w_{ikjp}|\bigl(L_{jp}|x_{jp}|+\beta_{jp}\bigr)+|I_{ik}|\Bigr]\le d_{ik}.\tag{3.3}$$
Thus $T$ is a self-map from $\Omega$ to $\Omega$. By the Brouwer fixed-point theorem, there exists at least one $x^{*}\in\Omega$ such that $T(x^{*})=x^{*}$. That is, $a_{ik}x_{ik}^{*}=\sum_{p\neq k}\sum_{j}w_{ikjp}f_{jp}(x_{jp}^{*})+I_{ik}$, which, since $1-\alpha_{ik}=a_{ik}\theta_{ik}(h)$, is equivalent to
$$x_{ik}^{*}=\alpha_{ik}x_{ik}^{*}+\theta_{ik}(h)\Bigl[\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}w_{ikjp}f_{jp}(x_{jp}^{*})+I_{ik}\Bigr].\tag{3.4}$$
Therefore $x^{*}$ is an equilibrium point of the MAM neural network (2.7).
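The Brouwer fixed-point theorem only guarantees that the self-map $T$ has a fixed point; it does not provide one. As an illustration, a fixed point can often be located numerically by simple iteration of $T$, as in the following sketch. The convergence of the plain iteration is our own assumption here (it holds under contraction-type conditions), not a consequence of Theorem 3.2.

```python
import numpy as np

# Illustrative sketch: locate an equilibrium of (2.7) by iterating the map
# T(x)_ik = (1/a_ik) * [sum_jp w_ikjp f_jp(x_jp) + I_ik] from the proof.
# Convergence of this heuristic is assumed, not guaranteed by Brouwer.

def find_equilibrium(a, W, I, f, x0, iters=1000, tol=1e-10):
    x = x0.astype(float)
    for _ in range(iters):
        x_new = (W @ f(x) + I) / a        # the self-map T from the proof
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x
```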

Next we prove the global exponential stability of the equilibrium point of the discrete-time MAM neural network (2.7).

Theorem 3.3. Suppose that all the neuronal activation functions are continuous and the conditions (H1) and (H2) hold. If $A-WL$ is a nonsingular M-matrix, then the equilibrium point of the MAM neural network (2.7) is globally exponentially stable.

Proof. Let $x^{*}=(x_{11}^{*},\ldots,x_{n_{m}m}^{*})^{T}$ be an equilibrium point of the MAM neural network (2.7), and let $x(n)=(x_{11}(n),\ldots,x_{n_{m}m}(n))^{T}$ be an arbitrary solution of (2.7). Set
$$y_{ik}(n)=x_{ik}(n)-x_{ik}^{*},\tag{3.5}$$
where $i\in\mathbb{N}(1,n_{k})$, $k\in\mathbb{N}(1,m)$. Define functions
$$g_{jp}(u)=f_{jp}(u+x_{jp}^{*})-f_{jp}(x_{jp}^{*}),\tag{3.6}$$
where $j\in\mathbb{N}(1,n_{p})$, $p\in\mathbb{N}(1,m)$. Obviously, $g_{jp}(0)=0$, and from the condition (H1), we have
$$|g_{jp}(u)|\le L_{jp}|u|\tag{3.7}$$
for any $u\in\mathbb{R}$. By (3.5), the MAM neural network (2.7) is reduced to the form
$$y_{ik}(n+1)=\alpha_{ik}y_{ik}(n)+\theta_{ik}(h)\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}w_{ikjp}g_{jp}\bigl(y_{jp}(n-\tau_{ikjp}(n))\bigr),\tag{3.8}$$
where $i\in\mathbb{N}(1,n_{k})$, $k\in\mathbb{N}(1,m)$. Obviously, there exists an equilibrium point $y=0$ of the system (3.8). From (2.8) and (3.5), the initial conditions associated with (3.8) are of the form
$$y_{ik}(s)=\psi_{ik}(s)=\phi_{ik}(s)-x_{ik}^{*},\tag{3.9}$$
where $s\in\mathbb{N}(-\tau,0)$, $i\in\mathbb{N}(1,n_{k})$, $k\in\mathbb{N}(1,m)$. Let $\|\psi\|=\max_{k\in\mathbb{N}(1,m)}\max_{i\in\mathbb{N}(1,n_{k})}\max_{s\in\mathbb{N}(-\tau,0)}|\psi_{ik}(s)|$.
Because $A-WL$ is a nonsingular M-matrix, there exist constants $\xi_{ik}>0$ ($i\in\mathbb{N}(1,n_{k})$, $k\in\mathbb{N}(1,m)$) such that
$$a_{ik}\xi_{ik}-\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}\xi_{jp}L_{jp}|w_{ikjp}|>0.\tag{3.10}$$
Define the functions
$$F_{ik}(\lambda)=\xi_{ik}\bigl(1-\lambda e^{-a_{ik}h}\bigr)-\lambda^{\tau+1}\theta_{ik}(h)\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}\xi_{jp}L_{jp}|w_{ikjp}|,\tag{3.11}$$
where $\lambda\in[1,\infty)$, $i\in\mathbb{N}(1,n_{k})$, $k\in\mathbb{N}(1,m)$. Apparently each $F_{ik}$ is a strictly monotone decreasing and continuous function. In view of (3.10) and the identity $1-e^{-a_{ik}h}=a_{ik}\theta_{ik}(h)$, it is clear that $F_{ik}(1)=\theta_{ik}(h)\bigl(a_{ik}\xi_{ik}-\sum_{p\neq k}\sum_{j}\xi_{jp}L_{jp}|w_{ikjp}|\bigr)>0$. Therefore there exist $\lambda_{ik}>1$ such that $F_{ik}(\lambda_{ik})=0$ ($i\in\mathbb{N}(1,n_{k})$, $k\in\mathbb{N}(1,m)$). Taking $\lambda=\min_{k\in\mathbb{N}(1,m)}\min_{i\in\mathbb{N}(1,n_{k})}\lambda_{ik}>1$, we have
$$\xi_{ik}\bigl(1-\lambda e^{-a_{ik}h}\bigr)-\lambda^{\tau+1}\theta_{ik}(h)\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}\xi_{jp}L_{jp}|w_{ikjp}|\ge 0\tag{3.12}$$
for $i\in\mathbb{N}(1,n_{k})$, $k\in\mathbb{N}(1,m)$.
Set $V_{ik}(n)=\lambda^{n}|y_{ik}(n)|$. By calculating the difference along the solutions of system (3.8), we have
$$V_{ik}(n+1)\le\lambda e^{-a_{ik}h}V_{ik}(n)+\lambda^{n+1}\theta_{ik}(h)\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}|w_{ikjp}|\,\bigl|g_{jp}\bigl(y_{jp}(n-\tau_{ikjp}(n))\bigr)\bigr|.\tag{3.13}$$
By using the inequality (3.7) and the condition (H2), we have
$$V_{ik}(n+1)\le\lambda e^{-a_{ik}h}V_{ik}(n)+\lambda^{\tau+1}\theta_{ik}(h)\sum_{p=1,\,p\neq k}^{m}\sum_{j=1}^{n_{p}}L_{jp}|w_{ikjp}|V_{jp}\bigl(n-\tau_{ikjp}(n)\bigr).\tag{3.14}$$
By Lemma 3.1, we have $a_{ik}he^{-a_{ik}h}<a_{ik}\theta_{ik}(h)<a_{ik}h$, so that $0<\theta_{ik}(h)<h$. Let $\xi=\min_{k\in\mathbb{N}(1,m)}\min_{i\in\mathbb{N}(1,n_{k})}\xi_{ik}$ and $\beta=\xi^{-1}\|\psi\|$, where $\beta$ is a positive constant. Therefore, when $n\in\mathbb{N}(-\tau,0)$, we have
$$V_{ik}(n)=\lambda^{n}|y_{ik}(n)|\le|\psi_{ik}(n)|\le\|\psi\|\le\beta\xi_{ik}\tag{3.15}$$
for $i\in\mathbb{N}(1,n_{k})$, $k\in\mathbb{N}(1,m)$. We assert that $V_{ik}(n)\le\beta\xi_{ik}$ for $n\in\mathbb{N}(-\tau)$, $i\in\mathbb{N}(1,n_{k})$, and $k\in\mathbb{N}(1,m)$. If the assertion is false, then there exist $i_{0}$, $k_{0}$, and a minimum time $n_{0}\in\mathbb{N}(1)$ such that $V_{i_{0}k_{0}}(n_{0})>\beta\xi_{i_{0}k_{0}}$ and $V_{jp}(n)\le\beta\xi_{jp}$ when $n\in\mathbb{N}(-\tau,n_{0}-1)$, $j\in\mathbb{N}(1,n_{p})$, $p\in\mathbb{N}(1,m)$. From (3.12), (3.14), and (3.15), and noticing that $\lambda>1$, we obtain
$$V_{i_{0}k_{0}}(n_{0})\le\beta\Bigl[\lambda e^{-a_{i_{0}k_{0}}h}\xi_{i_{0}k_{0}}+\lambda^{\tau+1}\theta_{i_{0}k_{0}}(h)\sum_{p=1,\,p\neq k_{0}}^{m}\sum_{j=1}^{n_{p}}\xi_{jp}L_{jp}|w_{i_{0}k_{0}jp}|\Bigr]\le\beta\xi_{i_{0}k_{0}}.\tag{3.16}$$
It conflicts with $V_{i_{0}k_{0}}(n_{0})>\beta\xi_{i_{0}k_{0}}$. Therefore, $V_{ik}(n)\le\beta\xi_{ik}$ for $n\in\mathbb{N}(-\tau)$. Then we have
$$|y_{ik}(n)|\le\beta\xi_{ik}\lambda^{-n}\le M\|\psi\|\lambda^{-n},\quad n\in\mathbb{Z}^{+},\tag{3.17}$$
where $M=\xi^{-1}\max_{k\in\mathbb{N}(1,m)}\max_{i\in\mathbb{N}(1,n_{k})}\xi_{ik}\ge 1$. So the zero solution of the system (3.8) is globally exponentially stable; thus the equilibrium point $x^{*}$ of the discrete-time MAM neural network (2.7) is globally exponentially stable.
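The constant $\lambda>1$ in (3.12) can also be computed numerically: each $F_{ik}$ in (3.11) is strictly decreasing with $F_{ik}(1)>0$, so its root can be bracketed and found by bisection, and $\lambda$ taken as the minimum over all neurons. The sketch below assumes the form of $F_{ik}$ given in (3.11) and uses hypothetical parameters.

```python
import numpy as np

# Illustrative sketch: find the root of F_ik(lambda) from (3.11) by
# bisection for one neuron; xi_ik, a_ik, h, tau, and the coupling sum
# are hypothetical placeholders satisfying (3.10).

def F(lam, xi_ik, a_ik, h, tau, coupling):
    # coupling = sum over (j, p) of xi_jp * L_jp * |w_ikjp|
    theta = (1.0 - np.exp(-a_ik * h)) / a_ik
    return xi_ik * (1.0 - lam * np.exp(-a_ik * h)) \
        - lam ** (tau + 1) * theta * coupling

def root_below(xi_ik, a_ik, h, tau, coupling, hi=10.0):
    lo = 1.0                              # F(1) > 0 is guaranteed by (3.10)
    while F(hi, xi_ik, a_ik, h, tau, coupling) > 0:
        hi *= 2.0                         # F -> -inf, so a bracket exists
    for _ in range(60):                   # bisection: F is strictly decreasing
        mid = 0.5 * (lo + hi)
        if F(mid, xi_ik, a_ik, h, tau, coupling) > 0:
            lo = mid
        else:
            hi = mid
    return lo                             # F(lo) >= 0, as (3.12) requires
```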

4. An Example

Consider the following discrete-time MAM neural network with three fields:
$$x_{ik}(n+1)=\alpha_{ik}x_{ik}(n)+\theta_{ik}(h)\Bigl[\sum_{p=1,\,p\neq k}^{3}\sum_{j=1}^{n_{p}}w_{ikjp}f_{jp}\bigl(x_{jp}(n-\tau_{ikjp}(n))\bigr)+I_{ik}\Bigr],\tag{4.1}$$
where the neuronal signal decay rates $a_{ik}$, the external inputs $I_{ik}$, and the connection weights $w_{ikjp}$ are given constants, the neuronal activation functions are $f_{jp}$, and the time delays are the bounded integer-valued sequences $\tau_{ikjp}(n)$.

Obviously, the signal transfer functions are continuous and satisfy the condition (H1) with Lipschitz constants $L_{jp}$, and the time delays satisfy the condition (H2) with bound $\tau$.

By calculating, it is easy to verify that the matrix $A-WL$ is a nonsingular M-matrix. Then, by Theorems 3.2 and 3.3, there exists an equilibrium point which is globally exponentially stable for the MAM neural network (4.1).

The numerical simulation is given in Figure 1, in which ten sets of initial values are taken at random. From Figure 1, we can see that the MAM neural network (4.1) converges globally exponentially to the equilibrium point no matter which initial states it starts from.
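A simulation in the spirit of Figure 1 can be reproduced with the following sketch. Since it is only illustrative, the step size, decay rates, weights, inputs, activation function, and delay sequence below are hypothetical placeholders chosen so that $A-WL$ is a nonsingular M-matrix; they are not the parameter values of the example (4.1).

```python
import numpy as np

# Illustrative simulation in the spirit of Figure 1: ten random initial
# histories for a three-field network of four neurons. All parameter
# values are hypothetical placeholders, NOT those of the example (4.1).

rng = np.random.default_rng(0)
h, tau_max, steps = 0.1, 2, 300
a = np.array([1.0, 1.2, 0.9, 1.1])         # hypothetical decay rates
I = np.array([0.3, -0.2, 0.1, 0.4])        # hypothetical external inputs
W = np.array([[0.0, 0.0, 0.2, -0.1],       # hypothetical weights; zero
              [0.0, 0.0, -0.1, 0.2],       # diagonal blocks for the fields
              [0.1, 0.2, 0.0, 0.1],        # {1, 2}, {3}, {4}
              [-0.2, 0.1, 0.1, 0.0]])
f = np.tanh                                # hypothetical activation (L = 1)
alpha = np.exp(-a * h)
theta = (1.0 - alpha) / a

for _ in range(10):                        # ten random initial histories
    hist = [rng.uniform(-2, 2, 4) for _ in range(tau_max + 1)]
    for n in range(steps):
        d = n % (tau_max + 1)              # hypothetical bounded delay tau(n)
        x_del = hist[-1 - d]
        hist.append(alpha * hist[-1] + theta * (W @ f(x_del) + I))
    print(np.round(hist[-1], 4))           # all runs approach the same point
```

Because $A-W$ here is strictly diagonally dominant (hence a nonsingular M-matrix with $L_{jp}=1$ for tanh), all ten printed final states coincide, mirroring the convergence seen in Figure 1.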

5. Conclusions

In this paper, we have formulated a discrete-time analogue of the continuous-time multidirectional associative memory neural network with time-varying delays by using the semidiscretization method. Sufficient conditions for the existence and the global exponential stability of an equilibrium point have been obtained. Our results show that the discrete-time analogue inherits the existence and global exponential stability of the equilibrium point of the continuous-time MAM neural network.