Abstract

The existence of equilibrium solutions to reaction-diffusion recurrent neural networks with Dirichlet boundary conditions on time scales is proved by topological degree theory and the M-matrix method. Under some sufficient conditions, we obtain the uniqueness and global exponential stability of the equilibrium solution to reaction-diffusion recurrent neural networks with Dirichlet boundary conditions on time scales by constructing a suitable Lyapunov functional and using inequality techniques. An example is given to illustrate the effectiveness of our results.

1. Introduction

In the past few years, various neural network models have been extensively investigated and successfully applied to signal processing, image processing, pattern classification, quadratic optimization, associative memory, moving object speed detection, and so forth. Such applications heavily depend on the dynamical behaviors of the neural networks. Therefore, the analysis of the dynamical behaviors is a necessary step for practical design of neural networks.

As is well known, both in biological and man-made neural networks, strictly speaking, diffusion effects cannot be avoided in the neural network models when electrons are moving in asymmetric electromagnetic fields, so we must consider that the activations vary in space as well as in time. References [1–10] have considered the stability of neural networks with diffusion terms, which are expressed by partial differential equations. It is also common to consider the diffusion effects in biological systems (such as immigration, see, e.g., [11–13]). For more details of the literature related to models of reaction-diffusion neural networks and their applications, the reader is referred to [14–21] and the references cited therein.

In fact, both continuous and discrete systems are important in implementation and applications. However, it is troublesome to study the dynamical behavior of continuous and discrete systems separately. It is therefore meaningful to study them on time scales, which unify the continuous and discrete situations [22, 23].

To the best of our knowledge, few authors have considered the global exponential stability of reaction-diffusion recurrent neural networks with Dirichlet boundary conditions on time scales, which is very important in theory and applications and is also a very challenging problem. Motivated by the above discussion, in this paper we will investigate the global exponential stability of the following reaction-diffusion recurrent neural network with initial value conditions and Dirichlet boundary conditions on time scales: where is a time scale and is unbounded, is the number of neurons in the network, , and , is a bounded compact set with smooth boundary in space , , and is the state of the th neuron at time and in space . The smooth function corresponds to the transmission diffusion operator along the th unit, represents the rate with which the th unit will reset its potential to the resting state in isolation when disconnected from the network and external inputs, denotes the strength of the th unit on the th unit at time and in space , is the synaptic connection strength of the th unit on the th unit at time and in space , denotes the activation function of the th unit at time and in space , , and is a constant input vector.

2. Preliminaries

In this section, we first recall some basic definitions and lemmas on time scales which are used in what follows.

Let $\mathbb{T}$ be a nonempty closed subset (time scale) of $\mathbb{R}$. The forward and backward jump operators $\sigma, \rho : \mathbb{T} \to \mathbb{T}$ and the graininess $\mu : \mathbb{T} \to [0, +\infty)$ are defined, respectively, by
$$\sigma(t) = \inf\{s \in \mathbb{T} : s > t\}, \quad \rho(t) = \sup\{s \in \mathbb{T} : s < t\}, \quad \mu(t) = \sigma(t) - t.$$

A point $t \in \mathbb{T}$ is called left-dense if $t > \inf \mathbb{T}$ and $\rho(t) = t$, left-scattered if $\rho(t) < t$, right-dense if $t < \sup \mathbb{T}$ and $\sigma(t) = t$, and right-scattered if $\sigma(t) > t$. If $\mathbb{T}$ has a left-scattered maximum $m$, then $\mathbb{T}^{\kappa} = \mathbb{T} \setminus \{m\}$; otherwise $\mathbb{T}^{\kappa} = \mathbb{T}$. If $\mathbb{T}$ has a right-scattered minimum $m$, then $\mathbb{T}_{\kappa} = \mathbb{T} \setminus \{m\}$; otherwise $\mathbb{T}_{\kappa} = \mathbb{T}$.
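These definitions are easy to compute on a concrete discrete time scale. A minimal sketch (function names and the example time scale are illustrative, not from the paper):

```python
# Jump operators and graininess on the discrete time scale T = {0, 1, 3, 4, 7}.

def sigma(T, t):
    """Forward jump sigma(t) = inf{s in T : s > t} (convention: sigma(max T) = max T)."""
    later = [s for s in T if s > t]
    return min(later) if later else t

def rho(T, t):
    """Backward jump rho(t) = sup{s in T : s < t} (convention: rho(min T) = min T)."""
    earlier = [s for s in T if s < t]
    return max(earlier) if earlier else t

def mu(T, t):
    """Graininess mu(t) = sigma(t) - t."""
    return sigma(T, t) - t

T = [0, 1, 3, 4, 7]
# t = 1 is right-scattered (sigma(1) = 3 > 1) and left-scattered (rho(1) = 0 < 1);
# on T = R every point would be dense and mu would vanish identically.
```

On such a purely discrete time scale every interior point is scattered; dense points arise only where the time scale contains a continuum.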

Definition 2.1 (see [24]). A function $f : \mathbb{T} \to \mathbb{R}$ is called regulated provided its right-sided limits exist (finite) at all right-dense points in $\mathbb{T}$ and its left-sided limits exist (finite) at all left-dense points in $\mathbb{T}$.

Definition 2.2 (see [24]). A function $f : \mathbb{T} \to \mathbb{R}$ is called rd-continuous provided it is continuous at right-dense points in $\mathbb{T}$ and its left-sided limits exist (finite) at left-dense points in $\mathbb{T}$. The set of rd-continuous functions $f : \mathbb{T} \to \mathbb{R}$ will be denoted by $C_{\mathrm{rd}} = C_{\mathrm{rd}}(\mathbb{T}) = C_{\mathrm{rd}}(\mathbb{T}, \mathbb{R})$.

Definition 2.3 (see [24]). Assume $f : \mathbb{T} \to \mathbb{R}$ and $t \in \mathbb{T}^{\kappa}$. Then we define $f^{\Delta}(t)$ to be the number (if it exists) with the property that, given any $\varepsilon > 0$, there exists a neighborhood $U$ of $t$ (i.e., $U = (t - \delta, t + \delta) \cap \mathbb{T}$ for some $\delta > 0$) such that $|f(\sigma(t)) - f(s) - f^{\Delta}(t)(\sigma(t) - s)| \le \varepsilon |\sigma(t) - s|$ for all $s \in U$. We call $f^{\Delta}(t)$ the delta (or Hilger) derivative of $f$ at $t$. The set of functions $f : \mathbb{T} \to \mathbb{R}$ that are delta differentiable and whose derivative is rd-continuous is denoted by $C_{\mathrm{rd}}^{1} = C_{\mathrm{rd}}^{1}(\mathbb{T}) = C_{\mathrm{rd}}^{1}(\mathbb{T}, \mathbb{R})$.

If $f$ is continuous, then $f$ is rd-continuous. If $f$ is rd-continuous, then $f$ is regulated. If $f$ is delta differentiable at $t$, then $f$ is continuous at $t$.
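On concrete time scales the delta (Hilger) derivative takes a familiar form: on $\mathbb{T} = h\mathbb{Z}$ it is the forward difference quotient, and as $h \to 0$ it recovers the ordinary derivative on $\mathbb{T} = \mathbb{R}$. A minimal sketch (names illustrative):

```python
# Delta derivative on the time scale hZ with step h > 0:
#   f^Delta(t) = (f(t + h) - f(t)) / h.
# For h = 1 (T = Z) this is the forward difference; as h -> 0 it
# approximates the ordinary derivative (T = R).

def delta_derivative_hZ(f, t, h):
    """Delta derivative of f at t on the time scale hZ."""
    return (f(t + h) - f(t)) / h

f = lambda t: t * t
# On Z: f^Delta(3) = 4**2 - 3**2 = 7, whereas the ordinary derivative f'(3) = 6.
```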

Lemma 2.4 (see [24]). Let $f : \mathbb{T} \to \mathbb{R}$ be regulated; then there exists a function $F$ which is delta differentiable with region of differentiation $D$ such that $F^{\Delta}(t) = f(t)$ holds for all $t \in D$.

Definition 2.5 (see [24]). Assume $f : \mathbb{T} \to \mathbb{R}$ is a regulated function. Any function $F$ as in Lemma 2.4 is called a $\Delta$-antiderivative of $f$. We define the indefinite integral of a regulated function $f$ by $\int f(t)\,\Delta t = F(t) + C$, where $C$ is an arbitrary constant and $F$ is a $\Delta$-antiderivative of $f$. We define the Cauchy integral by $\int_{a}^{b} f(t)\,\Delta t = F(b) - F(a)$ for all $a, b \in \mathbb{T}$. A function $F : \mathbb{T} \to \mathbb{R}$ is called an antiderivative of $f : \mathbb{T} \to \mathbb{R}$ provided $F^{\Delta}(t) = f(t)$ holds for all $t \in \mathbb{T}^{\kappa}$.

Lemma 2.6 (see [24]). If $a, b, c \in \mathbb{T}$, $\alpha \in \mathbb{R}$, and $f, g \in C_{\mathrm{rd}}$, then
(i) $\int_{a}^{b} [f(t) + g(t)]\,\Delta t = \int_{a}^{b} f(t)\,\Delta t + \int_{a}^{b} g(t)\,\Delta t$ and $\int_{a}^{b} \alpha f(t)\,\Delta t = \alpha \int_{a}^{b} f(t)\,\Delta t$;
(ii) if $f(t) \ge 0$ for all $a \le t < b$, then $\int_{a}^{b} f(t)\,\Delta t \ge 0$;
(iii) if $|f(t)| \le g(t)$ on $[a, b) := \{t \in \mathbb{T} : a \le t < b\}$, then $\left|\int_{a}^{b} f(t)\,\Delta t\right| \le \int_{a}^{b} g(t)\,\Delta t$.

A function $p : \mathbb{T} \to \mathbb{R}$ is called regressive if $1 + \mu(t)p(t) \ne 0$ for all $t \in \mathbb{T}^{\kappa}$. The set of all regressive and rd-continuous functions $p : \mathbb{T} \to \mathbb{R}$ will be denoted by $\mathcal{R} = \mathcal{R}(\mathbb{T}, \mathbb{R})$. We define the set $\mathcal{R}^{+}$ of all positively regressive elements of $\mathcal{R}$ by $\mathcal{R}^{+} = \{p \in \mathcal{R} : 1 + \mu(t)p(t) > 0 \text{ for all } t \in \mathbb{T}\}$. If $p$ is a regressive function, then the generalized exponential function $e_{p}$ is defined by $e_{p}(t, s) = \exp\left\{\int_{s}^{t} \xi_{\mu(\tau)}(p(\tau))\,\Delta\tau\right\}$ for $s, t \in \mathbb{T}$, with the cylinder transformation $\xi_{h}(z) = \mathrm{Log}(1 + hz)/h$ if $h \ne 0$ and $\xi_{0}(z) = z$.

Let $p, q : \mathbb{T} \to \mathbb{R}$ be two regressive functions; then we define
$$p \oplus q := p + q + \mu p q, \quad \ominus p := -\frac{p}{1 + \mu p}, \quad p \ominus q := p \oplus (\ominus q).$$

The generalized exponential function has the following properties.

Lemma 2.7 (see [24]). Assume that $p, q : \mathbb{T} \to \mathbb{R}$ are two regressive functions; then,
(i) $e_{0}(t, s) \equiv 1$ and $e_{p}(t, t) \equiv 1$;
(ii) $e_{p}(\sigma(t), s) = (1 + \mu(t)p(t))e_{p}(t, s)$;
(iii) $1/e_{p}(t, s) = e_{\ominus p}(t, s)$;
(iv) $e_{p}(t, s) = 1/e_{p}(s, t) = e_{\ominus p}(s, t)$;
(v) $e_{p}(t, s)e_{p}(s, r) = e_{p}(t, r)$;
(vi) $e_{p}(t, s) > 0$ for $p \in \mathcal{R}^{+}$;
(vii) $e_{p}(t, s)e_{q}(t, s) = e_{p \oplus q}(t, s)$.
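On $\mathbb{T} = \mathbb{Z}$ (graininess $\mu \equiv 1$) the generalized exponential function reduces to the finite product $e_p(t, s) = \prod_{\tau = s}^{t-1}(1 + p(\tau))$, which makes properties such as $e_p(t,t) = 1$ and $e_p e_q = e_{p \oplus q}$ easy to check numerically. A minimal sketch (function names are illustrative):

```python
from math import prod

def exp_Z(p, t, s):
    """Generalized exponential e_p(t, s) on T = Z for t >= s:
    the product of (1 + p(tau)) over tau = s, ..., t - 1 (empty product = 1)."""
    return prod(1 + p(tau) for tau in range(s, t))

def oplus(p, q):
    """Circle-plus addition on T = Z (graininess mu = 1): p + q + p*q."""
    return lambda t: p(t) + q(t) + p(t) * q(t)

p = lambda t: 0.5        # constant regressive function (1 + p(t) != 0)
q = lambda t: 0.1 * t    # regressive for t >= 0

# Properties from [24]: e_p(t, t) = 1 and e_{p (+) q} = e_p * e_q.
```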

Lemma 2.8 (see [24]). Assume that $f, g : \mathbb{T} \to \mathbb{R}$ are delta differentiable at $t \in \mathbb{T}^{\kappa}$. Then,
$$(fg)^{\Delta}(t) = f^{\Delta}(t)g(t) + f(\sigma(t))g^{\Delta}(t) = f(t)g^{\Delta}(t) + f^{\Delta}(t)g(\sigma(t)).$$
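The two product-rule forms in Lemma 2.8 can be checked numerically on $\mathbb{T} = \mathbb{Z}$, where $\sigma(t) = t + 1$ and the delta derivative is the forward difference. A minimal sketch with illustrative functions:

```python
# Product rule on T = Z:
#   (fg)^Delta = f^Delta g + f(sigma) g^Delta = f g^Delta + f^Delta g(sigma).

def d(f, t):
    """Delta derivative on T = Z: the forward difference."""
    return f(t + 1) - f(t)

f = lambda t: t * t
g = lambda t: 2 * t + 1
t = 4
lhs = d(lambda s: f(s) * g(s), t)
rhs1 = d(f, t) * g(t) + f(t + 1) * d(g, t)   # f^Delta g + f(sigma) g^Delta
rhs2 = f(t) * d(g, t) + d(f, t) * g(t + 1)   # f g^Delta + f^Delta g(sigma)
```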

Next, let us introduce the Banach space which is suitable for (1.1)–(1.3) and some assumed conditions which are needed in this paper.

Let , be an open bounded domain in with smooth boundary Let be the set consisting of all the vector functions which are -continuous with respect to and continuous with respect to respectively. For every and we define the set Then for every , is a Banach space with the norm where , , Obviously, is a Banach space equipped with the norm , where , ,

Definition 2.9. A function is called a solution of (1.1)–(1.3) if and only if satisfies (1.1), initial value conditions (1.2) and Dirichlet boundary conditions (1.3).

Definition 2.10. A constant vector is said to be an equilibrium solution to (1.1)–(1.3), if it satisfies ,

Definition 2.11. The equilibrium solution of recurrent neural network (1.1)–(1.3) is said to be globally exponentially stable if there exists a positive constant and such that every solution of (1.1)–(1.3) satisfies

Definition 2.12 (Lakshmikantham and Vatsala [25]). For each let be a neighborhood of Then, for define to mean that, given there exists a right neighborhood of such that for each where If is right-scattered and is continuous at this reduces to

3. Main Results

In this section, we will consider the existence, uniqueness, and global exponential stability of equilibrium of (1.1)–(1.3). To proceed, we need the following lemma.

Lemma 3.1 (see [14]). Let be a cube and let be a real-valued function belonging to which vanishes on the boundary of ; that is, . Then,
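Lemma 3.1 is a Poincaré-type inequality. As an illustrative numerical sanity check (one-dimensional, using the sharp constant $(l/\pi)^2$; the lemma in the text states the analogous bound on a cube, with a constant depending on the edge length): a function vanishing at the endpoints of $[0, l]$ satisfies $\int_0^l u^2\,dx \le (l/\pi)^2 \int_0^l (u')^2\,dx$, with equality for $u(x) = \sin(\pi x / l)$.

```python
# Numerical check of the sharp 1-D Poincare inequality for the extremal
# function u(x) = sin(pi x / l), which vanishes at x = 0 and x = l.
import math

l, n = 2.0, 100000
h = l / n
xs = [i * h for i in range(n + 1)]
u = [math.sin(math.pi * x / l) for x in xs]
int_u2 = sum(v * v for v in u) * h                      # ~ integral of u^2
du = [(u[i + 1] - u[i]) / h for i in range(n)]          # forward differences
int_du2 = sum(v * v for v in du) * h                    # ~ integral of (u')^2
ratio = int_u2 / int_du2                                # should approach (l/pi)^2
```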

Throughout this paper, we always assume that is Lipschitz continuous; that is, there exists a constant such that for any , and is an M-matrix, where , , ,
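The M-matrix assumption can be verified with a standard test: a Z-matrix (nonpositive off-diagonal entries) is a nonsingular M-matrix if and only if all its leading principal minors are positive. A minimal sketch, assuming the nonsingular case; the matrices below are illustrative, not the comparison matrix from the paper:

```python
# Testing the nonsingular M-matrix property via positive leading principal minors.

def is_Z_matrix(A):
    """Z-pattern: all off-diagonal entries nonpositive."""
    n = len(A)
    return all(A[i][j] <= 0 for i in range(n) for j in range(n) if i != j)

def det(A):
    """Determinant by Laplace expansion along the first row (fine for small n)."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] *
               det([row[:j] + row[j + 1:] for row in A[1:]])
               for j in range(n))

def is_nonsingular_M_matrix(A):
    """Z-matrix with all leading principal minors positive."""
    n = len(A)
    return is_Z_matrix(A) and all(
        det([row[:k] for row in A[:k]]) > 0 for k in range(1, n + 1))

A = [[3.0, -1.0], [-2.0, 4.0]]   # minors 3 and 10: a nonsingular M-matrix
B = [[1.0, -2.0], [-2.0, 1.0]]   # second minor 1 - 4 = -3: not an M-matrix
```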

Theorem 3.2. Assume that and hold. Then (1.1)–(1.3) has at least one equilibrium solution

Proof. By , it follows that Let where It is obvious that solutions to (3.3) are equilibria of (1.1)–(1.3). Let us define homotopic mapping We have That is, where , , , , and is an identity matrix.
Since is an M-matrix, we have (nonnegative matrix) and there exists a such that Let Then, is not empty, and it follows from (3.7) that for any (boundary of ), which implies that for So, from the homotopy invariance theorem, we have where denotes the topological degree. By topological degree theory, we can conclude that (3.3) has at least one solution in That is, (1.1)–(1.3) has at least one equilibrium solution This completes the proof.

Theorem 3.3. Assume that hold. Then the reaction-diffusion recurrent neural network (1.1)–(1.3) has a unique equilibrium solution which is globally exponentially stable.

Proof. The existence of equilibrium solutions for (1.1)–(1.3) follows from Theorem 3.2. Now we only need to prove the uniqueness and global exponential stability of equilibrium solutions for (1.1)–(1.3).
Suppose that and are two arbitrary solutions of (1.1)–(1.3) with conditions and define , ; then is governed by the following equations: where Calculating the delta derivative of along the solution of (3.10), we have, for From Green's formula [26], the Dirichlet boundary condition, and Lemma 3.1, we have, for By (3.11), (3.12), condition , and Hölder's inequality, we get where ,
If condition holds, we can always choose a positive number (possibly very small) such that for
Let us consider the functions where From (3.15), we obtain that and is continuous for furthermore, as thus there exists a constant such that and for Choosing obviously we have, for
Now consider the Lyapunov functional
Calculating the delta derivatives of along the solution of (3.10) and noting that if and only if (i.e., is increasing with respect to if and only if ), we have
From (3.17) and (3.18), we have, for which implies that where
Let and be two arbitrary equilibrium solutions of system (1.1)–(1.3). According to (3.20), we get here It follows that that is, the equilibrium solution of (1.1)–(1.3) is unique.
Let and be an arbitrary solution and the unique equilibrium solution of (1.1)–(1.3), respectively. In the light of (3.20), we obtain here Thus, by Definition 2.11, we obtain the global exponential stability of the unique equilibrium solution of (1.1)–(1.3). The proof is complete.

4. An Illustrative Example

Example 4.1. Consider the following reaction-diffusion recurrent neural network with Dirichlet boundary conditions on time scales: where is a time scale and is unbounded, , , and is the constant input vector. Obviously, satisfies the Lipschitz condition with Let , , , , , , , , and By a simple calculation, we have that is, , , and hold. Hence, it follows from Theorem 3.3 that (4.1) has a unique equilibrium solution which is globally exponentially stable.

Acknowledgment

This work was supported by the National Natural Science Foundation of China under Grant 10971183.