Research Article | Open Access

Kaihong Zhao, Yongkun Li, "Existence and Global Exponential Stability of Equilibrium Solution to Reaction-Diffusion Recurrent Neural Networks on Time Scales", *Discrete Dynamics in Nature and Society*, vol. 2010, Article ID 624619, 12 pages, 2010. https://doi.org/10.1155/2010/624619

# Existence and Global Exponential Stability of Equilibrium Solution to Reaction-Diffusion Recurrent Neural Networks on Time Scales

**Academic Editor:** Francisco Solis

#### Abstract

The existence of equilibrium solutions to reaction-diffusion recurrent neural networks with Dirichlet boundary conditions on time scales is proved by means of topological degree theory and the M-matrix method. Under some sufficient conditions, we obtain the uniqueness and global exponential stability of the equilibrium solution to reaction-diffusion recurrent neural networks with Dirichlet boundary conditions on time scales by constructing a suitable Lyapunov functional and using inequality techniques. An example is given to illustrate the effectiveness of our results.

#### 1. Introduction

In the past few years, various neural network models have been extensively investigated and successfully applied to signal processing, image processing, pattern classification, quadratic optimization, associative memory, moving object speed detection, and so forth. Such applications heavily depend on the dynamical behaviors of the neural networks. Therefore, the analysis of the dynamical behaviors is a necessary step for practical design of neural networks.

As is well known, both in biological and man-made neural networks, strictly speaking, diffusion effects cannot be avoided in the neural network models when electrons are moving in asymmetric electromagnetic fields, so we must consider that the activations vary in spaces as well as in time. References [1–10] have considered the stability of neural networks with diffusion terms, which are expressed by partial differential equations. It is also common to consider the diffusion effects in biological systems (such as immigration, see, e.g., [11–13]). For more details of the literature related to models of reaction-diffusion neural networks and their applications, the reader is referred to [14–21] and the references cited therein.

In fact, both continuous and discrete systems are very important in implementation and applications. It is, however, troublesome to study the dynamical behavior of continuous and discrete systems separately. It is therefore meaningful to study dynamics on time scales, which unify the continuous and discrete situations [22, 23].

To the best of our knowledge, few authors have considered the global exponential stability of reaction-diffusion recurrent neural networks with Dirichlet boundary conditions on time scales, which is very important in theory and applications and is also a very challenging problem. Motivated by the above discussion, in this paper we investigate the global exponential stability of the following reaction-diffusion recurrent neural network with initial value conditions and Dirichlet boundary conditions on time scales:
$$u_i^{\Delta}(t,x)=\sum_{k=1}^{m}\frac{\partial}{\partial x_k}\Big(D_{ik}\frac{\partial u_i(t,x)}{\partial x_k}\Big)-a_iu_i(t,x)+\sum_{j=1}^{n}b_{ij}f_j(u_j(t,x))+J_i,\quad(t,x)\in\mathbb{T}\times\Omega,\tag{1.1}$$
$$u_i(t_0,x)=\varphi_i(x),\quad x\in\Omega,\tag{1.2}$$
$$u_i(t,x)=0,\quad(t,x)\in\mathbb{T}\times\partial\Omega,\tag{1.3}$$
where $\mathbb{T}$ is a time scale that is unbounded above, $n$ is the number of neurons in the network, $i=1,2,\ldots,n$, $x=(x_1,x_2,\ldots,x_m)^{T}\in\Omega$, $\Omega$ is a bounded compact set with smooth boundary $\partial\Omega$ in the space $\mathbb{R}^{m}$, and $u_i(t,x)$ is the state of the $i$th neuron at time $t$ and in space $x$. The smooth function $D_{ik}\geq0$ corresponds to the transmission diffusion operator along the $i$th unit, $a_i>0$ represents the rate with which the $i$th unit resets its potential to the resting state in isolation when disconnected from the network and external inputs, $b_{ij}f_j(u_j(t,x))$ denotes the strength of the $j$th unit on the $i$th unit at time $t$ and in space $x$, $b_{ij}$ is the synaptic connection strength of the $j$th unit on the $i$th unit, $f_j$ denotes the activation function of the $j$th unit at time $t$ and in space $x$, and $J=(J_1,J_2,\ldots,J_n)^{T}$ is a constant input vector.

#### 2. Preliminaries

In this section, we first recall some basic definitions and lemmas on time scales which are used in what follows.

Let $\mathbb{T}$ be a nonempty closed subset (time scale) of $\mathbb{R}$. The forward and backward jump operators $\sigma,\rho:\mathbb{T}\to\mathbb{T}$ and the graininess $\mu:\mathbb{T}\to\mathbb{R}^{+}$ are defined, respectively, by
$$\sigma(t)=\inf\{s\in\mathbb{T}:s>t\},\qquad\rho(t)=\sup\{s\in\mathbb{T}:s<t\},\qquad\mu(t)=\sigma(t)-t.$$

A point $t\in\mathbb{T}$ is called left-dense if $t>\inf\mathbb{T}$ and $\rho(t)=t$, left-scattered if $\rho(t)<t$, right-dense if $t<\sup\mathbb{T}$ and $\sigma(t)=t$, and right-scattered if $\sigma(t)>t$. If $\mathbb{T}$ has a left-scattered maximum $m$, then $\mathbb{T}^{k}=\mathbb{T}\setminus\{m\}$; otherwise $\mathbb{T}^{k}=\mathbb{T}$. If $\mathbb{T}$ has a right-scattered minimum $m$, then $\mathbb{T}_{k}=\mathbb{T}\setminus\{m\}$; otherwise $\mathbb{T}_{k}=\mathbb{T}$.
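For orientation, on the two canonical time scales these operators take the familiar values:

```latex
% Jump operators and graininess on T = R and T = Z.
\begin{aligned}
\mathbb{T}=\mathbb{R}&:\quad \sigma(t)=\rho(t)=t, &\quad \mu(t)&=0 &&\text{(every point is dense)},\\
\mathbb{T}=\mathbb{Z}&:\quad \sigma(t)=t+1,\ \ \rho(t)=t-1, &\quad \mu(t)&=1 &&\text{(every point is scattered)}.
\end{aligned}
```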

*Definition 2.1 (see [24]). * A function $f:\mathbb{T}\to\mathbb{R}$ is called *regulated* provided its right-sided limits exist (finite) at all right-dense points in $\mathbb{T}$ and its left-sided limits exist (finite) at all left-dense points in $\mathbb{T}$.

*Definition 2.2 (see [24]). *A function $f:\mathbb{T}\to\mathbb{R}$ is called rd-continuous provided it is continuous at right-dense points in $\mathbb{T}$ and its left-sided limits exist (finite) at left-dense points in $\mathbb{T}$. The set of rd-continuous functions $f:\mathbb{T}\to\mathbb{R}$ will be denoted by $C_{rd}=C_{rd}(\mathbb{T})=C_{rd}(\mathbb{T},\mathbb{R})$.

*Definition 2.3 (see [24]). *Assume $f:\mathbb{T}\to\mathbb{R}$ and $t\in\mathbb{T}^{k}$. Then we define $f^{\Delta}(t)$ to be the number (if it exists) with the property that, given any $\varepsilon>0$, there exists a neighborhood $U$ of $t$ (i.e., $U=(t-\delta,t+\delta)\cap\mathbb{T}$ for some $\delta>0$) such that
$$\big|f(\sigma(t))-f(s)-f^{\Delta}(t)\,(\sigma(t)-s)\big|\leq\varepsilon\,|\sigma(t)-s|$$
for all $s\in U$. We call $f^{\Delta}(t)$ the delta (or Hilger) derivative of $f$ at $t$. The set of functions $f:\mathbb{T}\to\mathbb{R}$ that are delta differentiable and whose delta derivative is rd-continuous is denoted by $C_{rd}^{1}=C_{rd}^{1}(\mathbb{T})=C_{rd}^{1}(\mathbb{T},\mathbb{R})$.

If $f$ is continuous, then $f$ is rd-continuous. If $f$ is rd-continuous, then $f$ is regulated. If $f$ is delta differentiable at $t$, then $f$ is continuous at $t$.
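On the two canonical time scales, the Hilger derivative reduces to familiar objects:

```latex
% The delta derivative on T = R and T = Z.
\begin{aligned}
\mathbb{T}=\mathbb{R}&:\quad f^{\Delta}(t)=f'(t) &&\text{(ordinary derivative)},\\
\mathbb{T}=\mathbb{Z}&:\quad f^{\Delta}(t)=f(t+1)-f(t) &&\text{(forward difference)}.
\end{aligned}
```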

Lemma 2.4 (see [24]). *Let $f$ be regulated; then there exists a function $F$ which is delta differentiable with region of differentiation $D$ such that $F^{\Delta}(t)=f(t)$ holds for all $t\in D$.*

*Definition 2.5 (see [24]). *Assume $f:\mathbb{T}\to\mathbb{R}$ is a regulated function. Any function $F$ as in Lemma 2.4 is called a $\Delta$-antiderivative of $f$. We define the indefinite integral of a regulated function $f$ by
$$\int f(t)\,\Delta t=F(t)+C,$$
where $C$ is an arbitrary constant and $F$ is a $\Delta$-antiderivative of $f$. We define the Cauchy integral by
$$\int_{r}^{s}f(t)\,\Delta t=F(s)-F(r)\quad\text{for all } r,s\in\mathbb{T}.$$
A function $F:\mathbb{T}\to\mathbb{R}$ is called an antiderivative of $f:\mathbb{T}\to\mathbb{R}$ provided that $F^{\Delta}(t)=f(t)$ holds for all $t\in\mathbb{T}^{k}$.
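Correspondingly, the Cauchy integral specializes to the Riemann integral and to a finite sum:

```latex
% The Cauchy (delta) integral on T = R and T = Z (r < s).
\mathbb{T}=\mathbb{R}:\ \int_{r}^{s}f(t)\,\Delta t=\int_{r}^{s}f(t)\,dt,
\qquad
\mathbb{T}=\mathbb{Z}:\ \int_{r}^{s}f(t)\,\Delta t=\sum_{t=r}^{s-1}f(t).
```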

Lemma 2.6 (see [24]). *If $a,b,c\in\mathbb{T}$, $\alpha\in\mathbb{R}$, and $f,g\in C_{rd}(\mathbb{T})$, then *(i)*$\int_a^b[f(t)+g(t)]\,\Delta t=\int_a^b f(t)\,\Delta t+\int_a^b g(t)\,\Delta t$, $\int_a^b\alpha f(t)\,\Delta t=\alpha\int_a^b f(t)\,\Delta t$, and $\int_a^b f(t)\,\Delta t=\int_a^c f(t)\,\Delta t+\int_c^b f(t)\,\Delta t$;*(ii)*if $f(t)\geq0$ for all $a\leq t<b$, then $\int_a^b f(t)\,\Delta t\geq0$;*(iii)*if $|f(t)|\leq g(t)$ on $[a,b)$, then $\big|\int_a^b f(t)\,\Delta t\big|\leq\int_a^b g(t)\,\Delta t$. *

A function $p:\mathbb{T}\to\mathbb{R}$ is called regressive if $1+\mu(t)p(t)\neq0$ for all $t\in\mathbb{T}^{k}$. The set of all regressive and rd-continuous functions will be denoted by $\mathcal{R}=\mathcal{R}(\mathbb{T},\mathbb{R})$. We define the set $\mathcal{R}^{+}$ of all positively regressive elements of $\mathcal{R}$ by $\mathcal{R}^{+}=\{p\in\mathcal{R}:1+\mu(t)p(t)>0\text{ for all }t\in\mathbb{T}\}$. If $p$ is a regressive function, then the generalized exponential function $e_p$ is defined by
$$e_p(t,s)=\exp\left\{\int_s^t\xi_{\mu(\tau)}(p(\tau))\,\Delta\tau\right\}\quad\text{for } s,t\in\mathbb{T},$$
with the cylinder transformation
$$\xi_h(z)=\begin{cases}\dfrac{\operatorname{Log}(1+hz)}{h},& h\neq0,\\ z,& h=0.\end{cases}$$
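For a constant regressive coefficient $p$, the generalized exponential function has simple closed forms on the two canonical time scales:

```latex
% Closed forms of e_p(t,s) for constant p.
\mathbb{T}=\mathbb{R}:\ e_{p}(t,s)=e^{p(t-s)},
\qquad
\mathbb{T}=\mathbb{Z}:\ e_{p}(t,s)=(1+p)^{t-s}\quad(p\neq-1).
```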

Let $p,q:\mathbb{T}\to\mathbb{R}$ be two regressive functions; then we define
$$p\oplus q=p+q+\mu pq,\qquad \ominus p=-\frac{p}{1+\mu p},\qquad p\ominus q=p\oplus(\ominus q).$$

The generalized exponential function has the following properties.

Lemma 2.7 (see [24]). *Assume that $p,q:\mathbb{T}\to\mathbb{R}$ are two regressive functions, then, *(i)*$e_0(t,s)\equiv1$ and $e_p(t,t)\equiv1$;*(ii)*$e_p(\sigma(t),s)=(1+\mu(t)p(t))e_p(t,s)$;*(iii)*$\dfrac{1}{e_p(t,s)}=e_{\ominus p}(t,s)$;*(iv)*$e_p(t,s)=\dfrac{1}{e_p(s,t)}=e_{\ominus p}(s,t)$;*(v)*$e_p(t,s)e_p(s,r)=e_p(t,r)$;*(vi)*$e_p(t,s)>0$ for $p\in\mathcal{R}^{+}$;*(vii)*$e_p(t,s)e_q(t,s)=e_{p\oplus q}(t,s)$. *
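On $\mathbb{T}=\mathbb{Z}$ (where $\mu\equiv1$), the exponential is the finite product $e_p(t,0)=\prod_{\tau=0}^{t-1}(1+p(\tau))$, so identities of the kind collected in Lemma 2.7, such as $e_p(t,s)\,e_q(t,s)=e_{p\oplus q}(t,s)$, can be checked numerically. A minimal sketch, with arbitrarily chosen coefficient sequences:

```python
import math
import random

def circle_plus(p, q, mu=1.0):
    """Regressive addition p ⊕ q = p + q + μ·p·q (graininess μ)."""
    return p + q + mu * p * q

def exp_ts(p, t):
    """Generalized exponential e_p(t, 0) on T = Z: product of (1 + p(τ))."""
    return math.prod(1.0 + p[tau] for tau in range(t))

random.seed(0)
# Arbitrary regressive sequences on {0, 1, ..., 9}: 1 + p(τ) ≠ 0 is guaranteed
# because every value lies in (-0.5, 0.5).
p = [random.uniform(-0.5, 0.5) for _ in range(10)]
q = [random.uniform(-0.5, 0.5) for _ in range(10)]
pq = [circle_plus(p[t], q[t]) for t in range(10)]

# Check e_p(t,0) · e_q(t,0) = e_{p⊕q}(t,0) for every t.
for t in range(11):
    assert math.isclose(exp_ts(p, t) * exp_ts(q, t), exp_ts(pq, t))
print("semigroup property of e_p verified on T = Z")
```

The check works because $(1+p)(1+q)=1+(p+q+pq)$ factor by factor, which is exactly the definition of $\oplus$ when $\mu=1$.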

Lemma 2.8 (see [24]). *Assume that $f,g:\mathbb{T}\to\mathbb{R}$ are delta differentiable at $t\in\mathbb{T}^{k}$. Then
$$(fg)^{\Delta}(t)=f^{\Delta}(t)g(t)+f(\sigma(t))g^{\Delta}(t)=f(t)g^{\Delta}(t)+f^{\Delta}(t)g(\sigma(t)).$$*

Next, let us introduce a Banach space suitable for (1.1)–(1.3) and some conditions assumed throughout this paper.

Let $\Omega$ be an open bounded domain in $\mathbb{R}^{m}$ with smooth boundary $\partial\Omega$. Let $X$ be the set consisting of all the vector functions which are rd-continuous with respect to $t\in\mathbb{T}$ and continuous with respect to $x\in\Omega$, respectively. For every $t\in\mathbb{T}$, we consider the corresponding set of such functions on $\Omega$; for every fixed $t$, it is a Banach space under an $L^{2}$-type norm over $\Omega$, and $X$ itself is a Banach space equipped with the induced norm.

*Definition 2.9. *A function $u(t,x)$ is called a solution of (1.1)–(1.3) if and only if $u(t,x)$ satisfies (1.1), the initial value conditions (1.2), and the Dirichlet boundary conditions (1.3).

*Definition 2.10. *A constant vector $u^{*}=(u_1^{*},u_2^{*},\ldots,u_n^{*})^{T}$ is said to be an equilibrium solution to (1.1)–(1.3), if it satisfies
$$-a_iu_i^{*}+\sum_{j=1}^{n}b_{ij}f_j(u_j^{*})+J_i=0,\quad i=1,2,\ldots,n.$$

*Definition 2.11. *The equilibrium solution $u^{*}$ of recurrent neural network (1.1)–(1.3) is said to be globally exponentially stable if there exist a positive constant $\varepsilon$ with $\ominus\varepsilon\in\mathcal{R}^{+}$ and a constant $M\geq1$ such that every solution $u(t,x)$ of (1.1)–(1.3) satisfies
$$\|u(t,\cdot)-u^{*}\|\leq M\,e_{\ominus\varepsilon}(t,t_0)\,\|\varphi-u^{*}\|,\quad t\in\mathbb{T}.$$

*Definition 2.12 (Lakshmikantham and Vatsala [25]). *For each $t\in\mathbb{T}$, let $N$ be a neighborhood of $t$. Then, for $V\in C_{rd}[\mathbb{T}\times\mathbb{R}^{n},\mathbb{R}^{+}]$, define $D^{+}V^{\Delta}(t,x(t))$ to mean that, given $\varepsilon>0$, there exists a right neighborhood $N_{\varepsilon}\subset N$ of $t$ such that
$$\frac{V(\sigma(t),x(\sigma(t)))-V(s,x(s))}{\mu^{*}(t,s)}<D^{+}V^{\Delta}(t,x(t))+\varepsilon$$
for each $s\in N_{\varepsilon}$, $s>t$, where $\mu^{*}(t,s)=\sigma(t)-s$. If $t$ is right-scattered and $V(t,x(t))$ is continuous at $t$, this reduces to
$$D^{+}V^{\Delta}(t,x(t))=\frac{V(\sigma(t),x(\sigma(t)))-V(t,x(t))}{\sigma(t)-t}.$$

#### 3. Main Results

In this section, we will consider the existence, uniqueness, and global exponential stability of equilibrium of (1.1)–(1.3). To proceed, we need the following lemma.

Lemma 3.1 (see [14]). *Let $\Omega$ be a cube $|x_k|<l$ $(k=1,2,\ldots,m)$ and let $h(x)$ be a real-valued function belonging to $C^{1}(\Omega)$ which vanishes on the boundary $\partial\Omega$ of $\Omega$, that is, $h(x)|_{\partial\Omega}=0$. Then,
$$\int_{\Omega}h^{2}(x)\,dx\leq l^{2}\int_{\Omega}|\nabla h(x)|^{2}\,dx.$$*
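A quick numerical sanity check of this inequality in one dimension ($m=1$, $\Omega=(-l,l)$), with a sample function chosen to vanish on the boundary; the specific $h$ and $l$ below are illustrative:

```python
import numpy as np

# Lemma 3.1 in one dimension: for h ∈ C¹ with h(-l) = h(l) = 0,
#   ∫ h(x)² dx  ≤  l² ∫ (h'(x))² dx   over Ω = (-l, l).
l = 2.0
x = np.linspace(-l, l, 20001)
dx = x[1] - x[0]
h = np.cos(np.pi * x / (2.0 * l))                 # vanishes at x = ±l
lhs = np.sum(h**2) * dx                           # ≈ ∫ h²   (exact value: l)
rhs = l**2 * np.sum(np.gradient(h, x)**2) * dx    # ≈ l² ∫ (h')²  (exact: π²l/4)
print(lhs <= rhs)   # True, with room to spare: l ≤ π²l/4
```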

Throughout this paper, we always assume the following. $(H_1)$ Each activation function $f_j$ is Lipschitz continuous; that is, there exists a constant $L_j>0$ such that $|f_j(u)-f_j(v)|\leq L_j|u-v|$ for any $u,v\in\mathbb{R}$, $j=1,2,\ldots,n$. $(H_2)$ $A-WL$ is an M-matrix, where $A=\operatorname{diag}(a_1,a_2,\ldots,a_n)$, $W=(|b_{ij}|)_{n\times n}$, and $L=\operatorname{diag}(L_1,L_2,\ldots,L_n)$.
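The M-matrix condition can be checked numerically via the standard characterization of a nonsingular M-matrix: a Z-matrix (nonpositive off-diagonal entries) whose inverse is entrywise nonnegative. A minimal sketch; the matrices `A`, `W`, `L` below are hypothetical stand-ins for the paper's coefficients, not values from the text:

```python
import numpy as np

def is_m_matrix(M, tol=1e-12):
    """Nonsingular M-matrix test: Z-matrix with entrywise nonnegative inverse."""
    M = np.asarray(M, dtype=float)
    off_diag = M - np.diag(np.diag(M))
    if np.any(off_diag > tol):          # off-diagonal entries must be ≤ 0
        return False
    try:
        inv = np.linalg.inv(M)
    except np.linalg.LinAlgError:       # singular: not a nonsingular M-matrix
        return False
    return bool(np.all(inv >= -tol))    # inverse must be ≥ 0 entrywise

# Hypothetical data: A = diag(a_i), W = (|b_ij|), L = diag(L_j).
A = np.diag([3.0, 4.0])
W = np.array([[0.5, 0.4],
              [0.3, 0.6]])
L = np.diag([1.0, 1.0])
print(is_m_matrix(A - W @ L))   # True: strictly diagonally dominant Z-matrix
```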

Theorem 3.2. *Assume that $(H_1)$ and $(H_2)$ hold; then (1.1)–(1.3) has at least one equilibrium solution $u^{*}$.*

*Proof. *By $(H_1)$, it follows that
$$|f_j(u_j)|\leq L_j|u_j|+|f_j(0)|,\quad j=1,2,\ldots,n.\tag{3.2}$$
Let
$$\Phi(u)=-Au+Bf(u)+J=0,\tag{3.3}$$
where $u=(u_1,u_2,\ldots,u_n)^{T}$, $A=\operatorname{diag}(a_1,a_2,\ldots,a_n)$, $B=(b_{ij})_{n\times n}$, $f(u)=(f_1(u_1),f_2(u_2),\ldots,f_n(u_n))^{T}$, and $J=(J_1,J_2,\ldots,J_n)^{T}$. It is obvious that solutions to (3.3) are equilibria of (1.1)–(1.3). Let us define the homotopic mapping
$$H(u,\lambda)=\lambda\Phi(u)-(1-\lambda)u,\quad\lambda\in[0,1].\tag{3.4}$$
We have
$$|H_i(u,\lambda)|\geq\big(\lambda a_i+1-\lambda\big)|u_i|-\lambda\sum_{j=1}^{n}|b_{ij}|\big(L_j|u_j|+|f_j(0)|\big)-\lambda|J_i|,\quad i=1,2,\ldots,n.\tag{3.5}$$
That is,
$$|H(u,\lambda)|\geq\big[\lambda(A-WL)+(1-\lambda)E\big]\,|u|-\lambda P,\tag{3.6}$$
where $|u|=(|u_1|,\ldots,|u_n|)^{T}$, $W=(|b_{ij}|)_{n\times n}$, $L=\operatorname{diag}(L_1,\ldots,L_n)$, $P=W(|f_1(0)|,\ldots,|f_n(0)|)^{T}+(|J_1|,\ldots,|J_n|)^{T}$, and $E$ is an identity matrix.

Since $A-WL$ is an M-matrix, we have $(A-WL)^{-1}\geq0$ (nonnegative matrix) and there exists a vector $q=(q_1,\ldots,q_n)^{T}>0$ such that $(A-WL)q>0$. Let
$$\Omega_0=\big\{u:|u|<(A-WL)^{-1}P+q\big\}.\tag{3.7}$$
Then, $\Omega_0$ is not empty and it follows from (3.6) and (3.7) that, for any $u\in\partial\Omega_0$ (boundary of $\Omega_0$),
$$|H(u,\lambda)|\geq\lambda\big[(A-WL)|u|-P\big]+(1-\lambda)|u|>0,\tag{3.8}$$
which implies that $H(u,\lambda)\neq0$ for $u\in\partial\Omega_0$, $\lambda\in[0,1]$. So, from the homotopy invariance theorem, we have
$$\deg(\Phi,\Omega_0,0)=\deg(-E,\Omega_0,0)=(-1)^{n}\neq0,\tag{3.9}$$
where $\deg(\cdot)$ denotes topological degree. By topological degree theory, we can conclude that (3.3) has at least one solution in $\Omega_0$. That is, (1.1)–(1.3) has at least one equilibrium solution $u^{*}$. This completes the proof.

Theorem 3.3. *Assume that $(H_1)$ and $(H_2)$ hold; then the reaction-diffusion recurrent neural network (1.1)–(1.3) has a unique equilibrium solution, which is globally exponentially stable.*

*Proof. *The existence of an equilibrium solution for (1.1)–(1.3) follows from Theorem 3.2. Now we only need to prove the uniqueness and global exponential stability of the equilibrium solution of (1.1)–(1.3).

Suppose that $u(t,x)=(u_1(t,x),\ldots,u_n(t,x))^{T}$ and $v(t,x)=(v_1(t,x),\ldots,v_n(t,x))^{T}$ are two arbitrary solutions of (1.1)–(1.3) with initial conditions $\varphi(x)$ and $\psi(x)$, respectively, and define $w_i(t,x)=u_i(t,x)-v_i(t,x)$, $i=1,2,\ldots,n$; then $w(t,x)=(w_1(t,x),\ldots,w_n(t,x))^{T}$ is governed by the following equations:
$$w_i^{\Delta}(t,x)=\sum_{k=1}^{m}\frac{\partial}{\partial x_k}\Big(D_{ik}\frac{\partial w_i(t,x)}{\partial x_k}\Big)-a_iw_i(t,x)+\sum_{j=1}^{n}b_{ij}g_j(w_j(t,x)),\tag{3.10}$$
where $g_j(w_j(t,x))=f_j(u_j(t,x))-f_j(v_j(t,x))$, $j=1,2,\ldots,n$. Calculating the delta derivative of $\int_{\Omega}w_i^{2}(t,x)\,dx$ along the solutions of (3.10), we have, for $i=1,2,\ldots,n$,
From Green's formula [26], the Dirichlet boundary condition, and Lemma 3.1, we have, for $i=1,2,\ldots,n$,
By (3.11), (3.12), condition $(H_1)$, and the Hölder inequality, we get
where ,

If condition $(H_2)$ holds, we can always choose a positive number $\varepsilon$ (which may be very small) such that, for $i=1,2,\ldots,n$,

Let us consider the functions
where $i=1,2,\ldots,n$. From (3.15), we obtain that $\Theta_i(0)<0$ and $\Theta_i$ is continuous; furthermore, $\Theta_i(\varepsilon_i)\to+\infty$ as $\varepsilon_i\to+\infty$; thus there exist constants $\varepsilon_i^{*}>0$ such that $\Theta_i(\varepsilon_i^{*})=0$ and $\Theta_i(\varepsilon_i)<0$ for $\varepsilon_i\in(0,\varepsilon_i^{*})$. By choosing $\varepsilon=\min\{\varepsilon_1^{*},\ldots,\varepsilon_n^{*}\}$, obviously we have, for $i=1,2,\ldots,n$,

Now consider the Lyapunov functional

Calculating the delta derivative of $V$ along the solutions of (3.10) and noting that $e_{\varepsilon}^{\Delta}(t,t_0)=\varepsilon e_{\varepsilon}(t,t_0)\geq0$ if and only if $\varepsilon\geq0$ (i.e., $e_{\varepsilon}(t,t_0)$ is increasing with respect to $t$ if and only if $\varepsilon\geq0$), we have

From (3.17) and (3.18), we have, for $t\in\mathbb{T}$,
which implies that
where

Let $u^{*}$ and $v^{*}$ be two arbitrary equilibrium solutions of system (1.1)–(1.3). According to (3.20), we get $u^{*}=v^{*}$; that is, the equilibrium solution of (1.1)–(1.3) is unique.

Let $u(t,x)$ be an arbitrary solution and $u^{*}$ the unique equilibrium solution of (1.1)–(1.3). In the light of (3.20), we obtain the required exponential estimate. Thus, by Definition 2.11, we obtain the global exponential stability of the unique equilibrium solution of (1.1)–(1.3). The proof is complete.

#### 4. An Illustrative Example

*Example 4.1. *Consider the following reaction-diffusion recurrent neural network with Dirichlet boundary conditions on time scales:
where $\mathbb{T}$ is a time scale that is unbounded above and $J$ is the constant input vector. Obviously, the activation functions satisfy the Lipschitz condition of $(H_1)$. With the parameter values chosen above, a simple calculation shows that $(H_1)$ and $(H_2)$ hold. Hence, it follows from Theorem 3.3 that (4.1) has a unique equilibrium solution which is globally exponentially stable.
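To see the kind of behavior Theorem 3.3 guarantees, one can iterate a network of this type on the discrete time scale $\mathbb{T}=h\mathbb{Z}$, where $u^{\Delta}$ becomes a forward difference. The two-neuron system below omits diffusion and uses hypothetical parameters (chosen so that $A-WL$ is an M-matrix for the Lipschitz constant $L_j=1$ of $\tanh$); it is an illustration, not the system (4.1). Two trajectories started far apart collapse onto the same equilibrium:

```python
import numpy as np

# u^Δ = -A u + B tanh(u) + J on T = hZ reads u(t+h) = u(t) + h(-A u + B tanh(u) + J).
# All parameter values are hypothetical, for illustration only.
A = np.diag([3.0, 4.0])
B = np.array([[0.5, 0.4],
              [0.3, 0.6]])
J = np.array([1.0, 2.0])
h = 0.1

def step(u):
    """One forward-difference step of the network on T = hZ."""
    return u + h * (-A @ u + B @ np.tanh(u) + J)

u = np.array([5.0, -5.0])    # two widely separated initial states
v = np.array([-4.0, 6.0])
for _ in range(300):
    u, v = step(u), step(v)

print(np.allclose(u, v, atol=1e-8))   # True: both reach the unique equilibrium
```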

#### Acknowledgment

This work was supported by the National Natural Science Foundation of China under Grant 10971183.

#### References

1. Q. Song and J. Cao, “Global exponential stability and existence of periodic solutions in BAM networks with delays and reaction-diffusion terms,” *Chaos, Solitons and Fractals*, vol. 23, no. 2, pp. 421–430, 2005.
2. H. Zhao and G. Wang, “Existence of periodic oscillatory solution of reaction-diffusion neural networks with delays,” *Physics Letters A*, vol. 343, no. 5, pp. 372–383, 2005.
3. Q. Song, J. Cao, and Z. Zhao, “Periodic solutions and its exponential stability of reaction-diffusion recurrent neural networks with continuously distributed delays,” *Nonlinear Analysis: Theory, Methods & Applications*, vol. 7, no. 1, pp. 65–80, 2006.
4. Z. J. Zhao, Q. K. Song, and J. Y. Zhang, “Exponential periodicity and stability of neural networks with reaction-diffusion terms and both variable and unbounded delays,” *Computers & Mathematics with Applications*, vol. 51, no. 3-4, pp. 475–486, 2006.
5. J. Liang and J. Cao, “Global exponential stability of reaction-diffusion recurrent neural networks with time-varying delays,” *Physics Letters A*, vol. 314, no. 5-6, pp. 434–442, 2003.
6. Q. Song, Z. Zhao, and Y. Li, “Global exponential stability of BAM neural networks with distributed delays and reaction-diffusion terms,” *Physics Letters A*, vol. 335, no. 2-3, pp. 213–225, 2005.
7. B. T. Cui and X. Y. Lou, “Global asymptotic stability of BAM neural networks with distributed delays and reaction-diffusion terms,” *Chaos, Solitons and Fractals*, vol. 27, no. 5, pp. 1347–1354, 2006.
8. L. Wang and Y. Gao, “Global exponential robust stability of reaction-diffusion interval neural networks with time-varying delays,” *Physics Letters A*, vol. 350, no. 5-6, pp. 342–348, 2006.
9. J. Sun and L. Wan, “Convergence dynamics of stochastic reaction-diffusion recurrent neural networks with delays,” *International Journal of Bifurcation and Chaos*, vol. 15, no. 7, pp. 2131–2144, 2005.
10. L. Wang and D. Xu, “Asymptotic behavior of a class of reaction-diffusion equations with delays,” *Journal of Mathematical Analysis and Applications*, vol. 281, no. 2, pp. 439–453, 2003.
11. X.-X. Liao and J. Li, “Stability in Gilpin-Ayala competition models with diffusion,” *Nonlinear Analysis: Theory, Methods & Applications*, vol. 28, no. 10, pp. 1751–1758, 1997.
12. A. Hastings, “Global stability in Lotka-Volterra systems with diffusion,” *Journal of Mathematical Biology*, vol. 6, no. 2, pp. 163–168, 1978.
13. F. Rothe, “Convergence to the equilibrium state in the Volterra-Lotka diffusion equations,” *Journal of Mathematical Biology*, vol. 3, no. 3-4, pp. 319–324, 1976.
14. J. G. Lu, “Global exponential stability and periodicity of reaction-diffusion delayed recurrent neural networks with Dirichlet boundary conditions,” *Chaos, Solitons and Fractals*, vol. 35, no. 1, pp. 116–125, 2008.
15. J. G. Lu and L. J. Lu, “Global exponential stability and periodicity of reaction-diffusion recurrent neural networks with distributed delays and Dirichlet boundary conditions,” *Chaos, Solitons and Fractals*, vol. 39, no. 4, pp. 1538–1549, 2009.
16. J. Wang and J. G. Lu, “Global exponential stability of fuzzy cellular neural networks with delays and reaction-diffusion terms,” *Chaos, Solitons and Fractals*, vol. 38, no. 3, pp. 878–885, 2008.
17. J. Pan, X. Liu, and S. Zhong, “Stability criteria for impulsive reaction-diffusion Cohen-Grossberg neural networks with time-varying delays,” *Mathematical and Computer Modelling*, vol. 51, no. 9-10, pp. 1037–1050, 2010.
18. J. Pan and S. Zhong, “Dynamical behaviors of impulsive reaction-diffusion Cohen-Grossberg neural network with delays,” *Neurocomputing*, vol. 73, no. 7–9, pp. 1344–1351, 2010.
19. A. Wu and C. Fu, “Global exponential stability of non-autonomous FCNNs with Dirichlet boundary conditions and reaction-diffusion terms,” *Applied Mathematical Modelling*, vol. 34, no. 10, pp. 3022–3029, 2010.
20. J. Pan and Y. Zhan, “On periodic solutions to a class of non-autonomously delayed reaction-diffusion neural networks,” *Communications in Nonlinear Science and Numerical Simulation*, in press.
21. T. Ensari and S. Arik, “New results for robust stability of dynamical neural networks with discrete time delays,” *Expert Systems with Applications*, vol. 37, no. 8, pp. 5925–5930, 2010.
22. Y. Li, Y. Hua, and Y. Fei, “Global exponential stability of delayed Cohen-Grossberg BAM neural networks with impulses on time scales,” *Journal of Inequalities and Applications*, vol. 2009, Article ID 491268, 17 pages, 2009.
23. Y. Li, L. Zhao, and P. Liu, “Existence and exponential stability of periodic solution of high-order Hopfield neural network with delays on time scales,” *Discrete Dynamics in Nature and Society*, vol. 2009, Article ID 573534, 18 pages, 2009.
24. M. Bohner and A. Peterson, *Dynamic Equations on Time Scales: An Introduction with Applications*, Birkhäuser, Boston, Mass, USA, 2001.
25. V. Lakshmikantham and A. S. Vatsala, “Hybrid systems on time scales,” *Journal of Computational and Applied Mathematics*, vol. 141, no. 1-2, pp. 227–235, 2002.
26. S. Omatu and J. H. Seinfeld, *Distributed Parameter Systems: Theory and Applications*, Oxford Mathematical Monographs, The Clarendon Press, Oxford University Press, New York, NY, USA, 1989.

#### Copyright

Copyright © 2010 Kaihong Zhao and Yongkun Li. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.