
Abstract and Applied Analysis

Volume 2013 (2013), Article ID 630623, 13 pages

http://dx.doi.org/10.1155/2013/630623

## Robust Almost Periodic Dynamics for Interval Neural Networks with Mixed Time-Varying Delays and Discontinuous Activation Functions

Department of Applied Mathematics, Yanshan University, Qinhuangdao 066001, China

Received 2 March 2013; Accepted 16 May 2013

Academic Editor: Zidong Wang

Copyright © 2013 Huaiqin Wu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

The robust almost periodic dynamical behavior is investigated for interval neural networks with mixed time-varying delays and discontinuous activation functions. Firstly, based on the definition of the solution in the sense of Filippov for differential equations with discontinuous right-hand sides and the theory of differential inclusions, the existence and asymptotic almost periodicity of the solution of the interval network system are proved. Secondly, by constructing an appropriate generalized Lyapunov functional and employing linear matrix inequality (LMI) techniques, a delay-dependent criterion is derived which guarantees the existence, uniqueness, and global robust exponential stability of the almost periodic solution in terms of LMIs. Moreover, as special cases, the obtained results can be used to check the global robust exponential stability of a unique periodic solution/equilibrium for discontinuous interval neural networks with mixed time-varying delays and periodic/constant external inputs. Finally, an illustrative example is given to demonstrate the validity of the theoretical results.

#### 1. Introduction

In the past few decades, there has been increasing interest in different classes of neural networks, such as Hopfield, cellular, Cohen-Grossberg, and bidirectional associative neural networks, due to their potential applications in areas such as classification, signal and image processing, parallel computing, associative memories, optimization, and cryptography. In the design of practical neural networks, the qualitative analysis of neural network dynamics plays an important role; for example, to solve problems of optimization, neural control, and signal processing, neural networks have to be designed so that, for a given external input, they exhibit only one globally asymptotically/exponentially stable equilibrium point. Hence, exploring the global stability of neural networks is of primary importance.

In recent years, the global stability of neural networks with discontinuous activations has received extensive attention from many scholars under the Filippov framework; see, for example, [1–29] and the references therein. In [1], Forti and Nistri first dealt with the global asymptotic stability (GAS) and global convergence in finite time of a unique equilibrium point for neural networks modeled by differential equations with discontinuous right-hand sides, and several stability conditions were derived by using the Lyapunov diagonally stable (LDS) matrix condition and constructing suitable Lyapunov functions. In [2, 3], by applying a generalized Lyapunov approach and M-matrix theory, Forti et al. discussed the global exponential stability (GES) of neural networks with discontinuous or non-Lipschitz activation functions. Arguing as in [1], Lu and Chen dealt in [4] with GES and GAS of Cohen-Grossberg neural networks with discontinuous activation functions. In [5–11], by using differential inclusions and the Lyapunov functional approach, a series of results was obtained on the global stability of the unique equilibrium point of neural networks with a single constant time delay and discontinuous activations. In [12], under the framework of Filippov solutions, by using a matrix measure approach, Liu et al. investigated the global dissipativity and quasi-synchronization of time-varying delayed neural networks with discontinuous activations and parameter mismatches. In [13], using a method similar to that of [12], Liu et al. discussed the quasi-synchronization control issue for switched complex networks.

It is well known that an equilibrium point can be regarded as a special case of a periodic solution of a neural system, with arbitrary period or zero amplitude. Hence, the study of periodic solutions yields more general results than the study of equilibrium points. Recently, in parallel with the study of the global stability of equilibrium points of neural networks with discontinuous activation functions, much attention has been paid to the stability of periodic solutions for various neural network systems with discontinuous activations (see [15–29]). Influenced by Forti and Nistri, Chen et al. considered in [15] the global convergence in finite time toward a unique periodic solution for Hopfield neural networks with discontinuous activations. In [16, 17], the authors explored the periodic dynamical behavior of neural networks with time-varying delays and discontinuous activation functions; some conditions were proposed to ensure the existence and GES of the unique periodic solution. In [17–23], under the Filippov inclusion framework, by using the Leray-Schauder alternative theorem and the Lyapunov approach, the authors presented conditions for the existence and GES or GAS of the unique periodic solution for Hopfield or BAM neural networks with discontinuous activation functions. In [24], taking discontinuous activations as an example, Cheng et al. established the existence of anti-periodic solutions of discontinuous neural networks. In [25, 26], Wu et al. discussed the existence and GES of the unique periodic solution for neural networks with discontinuous activation functions under impulsive control. In [28, 29], under the framework of Filippov solutions, by using the Lyapunov approach and M-matrix theory, the authors presented stability results on periodic solutions for Cohen-Grossberg neural networks with a single constant time delay and discontinuous activation functions.

It should be pointed out that the results reported in [1–29] are concerned with the stability analysis of equilibrium points or periodic solutions and neglect the effect of almost periodicity for neural networks with discontinuous activation functions. However, almost periodicity is one of the basic properties of dynamical neural systems: an almost periodic trajectory retraces its path through phase space, but not exactly. Moreover, almost periodic functions, with their richer spatial structure, can be regarded as a generalization of periodic functions. In practice, as shown in [30, 31], almost periodic phenomena are more common than periodic ones, and almost periodic oscillatory behavior accords better with reality. Hence, exploring the global stability of almost periodic solutions of dynamical neural systems is of primary importance. Very recently, under the framework of the theory of Filippov differential inclusions, Allegretto et al. proved the common asymptotic behavior of almost periodic solutions for discontinuous, delayed, and impulsive neural networks in [30]. In [31, 32], Lu and Chen and Qin et al. discussed the existence and uniqueness of the almost periodic solution (as well as its global exponential stability) of delayed neural networks with almost periodic coefficients and discontinuous activations. In [33], Wang and Huang studied almost periodicity for a class of delayed Cohen-Grossberg neural networks with discontinuous activations. It should be noted that the network model explored in [30–33] is a class of discontinuous neural networks with a single constant time delay, and the stability conditions were obtained by using Lyapunov diagonally stable matrices or M-matrices. Compared with stability conditions expressed in terms of LMIs, the results obtained in [30–33] are rather conservative.

In hardware implementations of neural networks, due to unavoidable factors such as modeling error, external perturbation, and parameter fluctuation, the neural network model inevitably involves uncertainties such as perturbations and component variations, which can change the stability of the network. Therefore, it is of great importance to study the global robust stability of neural networks with time-varying delay. Generally speaking, two kinds of parameter uncertainty are frequently considered at present: interval uncertainty and norm-bounded uncertainty. In [34, 35], based on Lyapunov stability theory and matrix inequality techniques, the global robust stability of a unique equilibrium point for neural networks with norm-bounded uncertainties and discontinuous neuron activations was discussed. In [36], Guo and Huang analyzed the global robust stability of interval neural networks with discontinuous activations. In [37], Liu and Cao discussed the robust state estimation problem for time-varying delayed neural networks with discontinuous activation functions via differential inclusions, and some criteria were established to guarantee the existence of a robust state estimator.

It should be noted that, in the literature cited above [34–36], almost all results treat the robust stability of an equilibrium point for neural networks with parameter uncertainty and discontinuous neuron activations. Moreover, most of the above-mentioned results deal only with discrete time delays. Forti et al. pointed out that it would be interesting to investigate discontinuous neural networks with more general delays, such as time-varying or distributed ones. For example, in electronic implementations of analog neural networks, the delays between neurons are usually time varying and sometimes vary violently with time, due to the finite switching speed of amplifiers and faults in the electrical circuit. This motivates us to consider more general types of delays, such as discrete time-varying and distributed ones, which are in general more complex and therefore more difficult to deal with. To the best of our knowledge, up to now only a few researchers have dealt with the global robust stability of almost periodic solutions of discontinuous neural networks with mixed time-varying delays, which motivates the work of this paper.

In this paper, our aim is to study the delay-dependent robust exponential stability problem for the almost periodic solution of interval neural networks with mixed time-varying delays and discontinuous activation functions. Under the framework of Filippov differential inclusions, by applying nonsmooth Lyapunov stability theory and employing the highly efficient LMI approach, a new delay-dependent criterion is presented to ensure the existence and global robust exponential stability of the almost periodic solution in terms of LMIs. Moreover, the obtained conclusion is applied to prove the existence and robust stability of the periodic solution (or equilibrium point) for neural networks with mixed time-varying delays and discontinuous activations.

For convenience, some notation is introduced as follows. $\mathbb{R}$ denotes the set of real numbers, $\mathbb{R}^n$ denotes the $n$-dimensional Euclidean space, and $\mathbb{R}^{n\times m}$ denotes the set of all $n\times m$ real matrices. For any matrix $A$, $A>0$ ($A<0$) means that $A$ is positive definite (negative definite). $A^{-1}$ denotes the inverse of $A$. $A^{\mathsf T}$ denotes the transpose of $A$. $\lambda_{\max}(A)$ and $\lambda_{\min}(A)$ denote the maximum and minimum eigenvalue of $A$, respectively. $I$ denotes the identity matrix with compatible dimensions. The symbol $*$ denotes the transposed elements in symmetric positions. Given a vector $x=(x_1,\dots,x_n)^{\mathsf T}$, $\|x\|$ denotes the 2-norm of $x$; that is, $\|x\| = (x^{\mathsf T}x)^{1/2}$, and $\|A\| = (\rho(A^{\mathsf T}A))^{1/2}$, where $\rho(\cdot)$ denotes the spectral radius. For $\tau>0$, $\mathcal{C}([-\tau,0],\mathbb{R}^n)$ denotes the family of continuous functions $\varphi$ from $[-\tau,0]$ to $\mathbb{R}^n$ with the norm $\|\varphi\| = \sup_{-\tau\le s\le 0}|\varphi(s)|$. $\dot{x}(t)$ denotes the derivative of $x(t)$.

Given a set $E \subset \mathbb{R}^n$, $\overline{\mathrm{co}}[E]$ denotes the closure of the convex hull of $E$, and $\mathcal{K}(\mathbb{R}^n)$ denotes the collection of all nonempty, closed, and convex subsets of $\mathbb{R}^n$.

Let $V$ be a locally Lipschitz continuous function. Clarke’s generalized gradient [38] of $V$ at a point $x$ is built from the limits of $\nabla V$ along sequences converging to $x$ that avoid both the set of Lebesgue measure zero where $\nabla V$ does not exist and an arbitrary set with measure zero.
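In the standard notation of nonsmooth analysis, Clarke’s generalized gradient is commonly written as follows (the symbol names here are assumed for illustration):

```latex
\partial V(x) \;=\; \overline{\mathrm{co}}\,\Bigl\{\, \lim_{k\to\infty} \nabla V(x_k)
\;:\; x_k \to x,\; x_k \notin \Omega_V,\; x_k \notin N \,\Bigr\},
```

where $\Omega_V$ is the set of Lebesgue measure zero on which $\nabla V$ fails to exist and $N$ is an arbitrary set of measure zero.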

Let $X \subseteq \mathbb{R}^n$. A set-valued map $F$ on $X$ is said to be measurable if, for all $x \in \mathbb{R}^n$, the $\mathbb{R}_+$-valued function $y \mapsto d(x, F(y))$ is measurable. This definition of measurability is equivalent to graph measurability, $\mathrm{Gr}\,F \in \mathcal{L} \times \mathcal{B}(\mathbb{R}^n)$, where $\mathcal{L}$ is the Lebesgue $\sigma$-field of $X$ and $\mathcal{B}(\mathbb{R}^n)$ is the Borel $\sigma$-field of $\mathbb{R}^n$.

Let $X$, $Y$ be Hausdorff topological spaces and let $F$ be a set-valued map from $X$ to $Y$. We say that $F$ is upper semicontinuous if, for every nonempty closed subset $C$ of $Y$, the set $\{x \in X : F(x) \cap C \neq \emptyset\}$ is closed in $X$.

The set-valued map is said to have a closed (convex, compact) image if, for each , is closed (convex, compact).

The rest of this paper is organized as follows. In Section 2, the model formulation and some preliminaries are given. In Section 3, the existence and asymptotically almost periodic behavior of Filippov solutions are analyzed. Moreover, the proof of the existence of almost periodic solution is given. The global robust exponential stability is discussed, and a delay-dependent criterion is established in terms of LMIs. In Section 4, a numerical example is presented to demonstrate the validity of the proposed results. Some conclusions are drawn in Section 5.

#### 2. Model Description and Preliminaries

Consider the following interval neural network model with discrete and distributed time delays: where the state vector collects the neuron states at time $t$, the diagonal matrix entries are the neuron self-inhibitions, the real connection weight matrices represent the weighting coefficients of the neurons, the activation functions represent the neuron input-output activations, the real vector function represents the external inputs of the neurons at time $t$, and the discrete and distributed time-varying delays satisfy the stated bounds. The network parameters are assumed to range over given intervals.
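An interval model of the kind described above is commonly written in the following form; the symbol names ($x$, $D$, $A$, $B$, $C$, $f$, $I$, $\tau$, $h$) are assumptions for illustration:

```latex
\dot{x}(t) = -D\,x(t) + A f\bigl(x(t)\bigr) + B f\bigl(x(t-\tau(t))\bigr)
             + C \int_{t-h(t)}^{t} f\bigl(x(s)\bigr)\,ds + I(t),
```

```latex
D \in [\underline{D},\overline{D}],\quad
A \in [\underline{A},\overline{A}],\quad
B \in [\underline{B},\overline{B}],\quad
C \in [\underline{C},\overline{C}],
```

where $\tau(t)$ and $h(t)$ are the discrete and distributed time-varying delays and the interval brackets encode the parameter uncertainty.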

The activation function satisfies the following assumption: (1) each component is piecewise continuous; that is, it is continuous except on a countable set of jump discontinuity points and has only a finite number of jump discontinuity points in every compact set; (2) each component is nondecreasing.

System (3) can be equivalently written as where , , where denotes the column vector whose th element is 1 and whose other elements are 0.

Under assumption , is undefined at the points where is discontinuous, and , where , . System (3) is a differential equation with discontinuous right-hand side. For system (3), we adopt the following definition of the solution in the sense of Filippov [39].

*Definition 1.* A function , is a solution of system (3) on if (1) is continuous on and absolutely continuous on ; (2) satisfies

where .

By the assumption , it is easy to check that is an upper semicontinuous set-valued map with nonempty, compact, and convex values. Hence, is measurable [40]. By the measurable selection theorem, if is a solution of system (3), then there exists a measurable function such that and for a.a. .

The function in (8) is called an output solution associated with the state variable and represents the vector of neural network outputs.

*Definition 2.* For any continuous function and any measurable selection such that for a.a. , an absolutely continuous function associated with the measurable function is said to be a solution of the initial value problem (IVP) for system (3) on ( might be ) with initial value , , if

*Definition 3 (see [41]).* A continuous function is said to be almost periodic on if, for any scalar , there exist scalars and in any interval with the length , such that for all .
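Definition 3 is Bohr’s classical notion of almost periodicity; in symbols (notation assumed) it states:

```latex
\forall\, \varepsilon > 0,\ \exists\, l(\varepsilon) > 0 \ \text{such that every interval of length } l(\varepsilon)
\ \text{contains an } \omega \ \text{with} \
\bigl\| x(t+\omega) - x(t) \bigr\| < \varepsilon \quad \forall\, t \in \mathbb{R}.
```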

*Definition 4.* The almost periodic solution of interval neural network (3) is said to be globally robustly exponentially stable if, for any , , , , there exist scalars and , such that
where is the solution of system (3) with initial value , and is called the exponential convergence rate.
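The estimate required by Definition 4 typically takes the following form (symbols assumed): the distance between any solution $x(t)$ and the almost periodic solution $x^*(t)$ decays exponentially, uniformly over the admissible interval parameters:

```latex
\bigl\| x(t) - x^*(t) \bigr\| \;\le\; M\, e^{-\varepsilon t}
\sup_{-\tau \le s \le 0} \bigl\| \phi(s) - \phi^*(s) \bigr\|, \qquad t \ge 0,
```

where $\phi$ and $\phi^*$ are the respective initial functions, $M \ge 1$, and $\varepsilon > 0$ is the exponential convergence rate.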

Lemma 5 (chain rule [38]). *If $V(x)$ is C-regular and $x(t)$ is absolutely continuous on any compact interval of $[0,+\infty)$, then $x(t)$ and $V(x(t))$ are differentiable for a.a. $t \ge 0$, and $\frac{d}{dt}V(x(t)) = \langle \zeta(t), \dot{x}(t)\rangle$ for all $\zeta(t) \in \partial V(x(t))$.*

Lemma 6 (Jensen’s inequality [17]). *For any constant matrix $M \in \mathbb{R}^{n\times n}$ with $M = M^{\mathsf T} > 0$, any scalars $a$ and $b$ with $a < b$, and a vector function $x : [a,b] \to \mathbb{R}^n$ such that the integrals concerned are well defined,
$(b-a)\int_a^b x^{\mathsf T}(s)\, M\, x(s)\,ds \;\ge\; \Bigl(\int_a^b x(s)\,ds\Bigr)^{\mathsf T} M \Bigl(\int_a^b x(s)\,ds\Bigr).$*

Lemma 7 (see [42]). *Given any real matrices $X$, $Y$, $P$ of appropriate dimensions and a scalar $\varepsilon > 0$, if $P = P^{\mathsf T} > 0$, then the following inequality holds:
$X^{\mathsf T} Y + Y^{\mathsf T} X \;\le\; \varepsilon X^{\mathsf T} P X + \varepsilon^{-1} Y^{\mathsf T} P^{-1} Y.$*

Lemma 8 (see [35]). *Let $Q = Q^{\mathsf T}$, $H$, $E$, and $F(t)$ be real matrices of appropriate dimensions with $F(t)$ satisfying $F^{\mathsf T}(t)F(t) \le I$. Then
$Q + H F(t) E + E^{\mathsf T} F^{\mathsf T}(t) H^{\mathsf T} < 0$
for all $F(t)$ satisfying $F^{\mathsf T}(t)F(t) \le I$, if and only if there exists a positive constant $\varepsilon$, such that
$Q + \varepsilon H H^{\mathsf T} + \varepsilon^{-1} E^{\mathsf T} E < 0.$*

Lemma 9 (see [36]). *For any , , one has
**
where , , , . *

Lemma 10 (see [43]). *For sequence , if there exists , such that , and , a.e. , then , and
*

Before proceeding to the main results, the following further assumptions are needed. : , , and are continuous functions and possess the almost periodic property; that is, for any , there exist and in any interval with the length , such that . : For any , , , there exists a constant , such that . : For a given constant , there exist positive definite matrices , , and and a positive definite diagonal matrix , such that , where , , , , , .

#### 3. Main Results

Theorem 11. *Suppose that assumptions , , and are satisfied. Then interval neural network system (3) has a solution of the IVP on for any initial value , . *

*Proof.* For any initial value , , similar to the proof of Lemma 1 in [2], under the assumptions , system (3) has a local solution associated with a measurable function with initial value , on , where or , and is the maximal right-side existence interval of the local solution.

Consider the following Lyapunov functional candidate:
By Lemma 5, calculating the time derivative of along the local solution of system (3) on yields
Without loss of generality, we can suppose that . If this is not the case, set , . Then system (8) can be equivalently rewritten as
where , for a.a. , and . It is obvious that . In fact, under the assumptions and , we can choose a sufficiently small constant such that
Using Lemmas 6 and 7, we can obtain that
where , , , , .

can be rearranged as
where , , , , , , , and .

In view of Lemma 8, is equivalent to By the Schur complement, is equivalent to , so the LMI is also equivalent to . This implies that
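The Schur complement fact invoked in this step is the standard one for symmetric block matrices:

```latex
\begin{pmatrix} A & B \\ B^{\mathsf T} & C \end{pmatrix} < 0
\quad\Longleftrightarrow\quad
C < 0 \ \ \text{and}\ \ A - B\,C^{-1} B^{\mathsf T} < 0,
```

which allows the nonlinear matrix condition to be rewritten as an equivalent LMI.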
By the assumption , is bounded for . Hence, there exists a constant such that
It follows that
Integrating both sides of (31) from to , , it follows that
In view of the definition of in (21) and the fact that all the terms in are nonnegative, we have
Combining (32) and (33), it is easy to obtain
Therefore, . By the viability theorem in differential inclusion theory [40], we obtain . That is, system (3) has a solution of the IVP on for any initial value. The proof is completed.

Theorem 12. *Suppose that the assumptions are satisfied. Then the solution of the IVP of interval neural network system (3) is asymptotically almost periodic. *

*Proof.* Let be a solution of the IVP of system (3) associated with a measurable function with initial value , . Setting , we have
where
Consider a Lyapunov functional candidate as
Calculating the time derivative of along trajectories of system (35), similar to the proof of Theorem 11, we can get
From the proof of Theorem 11, we can get that is bounded. Consequently, is also bounded. Define , . By the assumption and Lemma 9, there exist positive constants and , such that
Therefore, by using the assumption , it is easy to obtain that, for any , there exist and in any interval with the length of , such that
This implies that
By combining (37) and (41), we have
Therefore, there exists , such that for any , , that is, . This shows that any solution of system (3) is asymptotically almost periodic. The proof is complete.

*Remark 13.* In the proof of Theorem 12, the assumption plays an important role. Under this assumption, can be ensured.

Theorem 14. *If the assumptions hold, then interval neural network system (3) has a unique almost periodic solution which is globally robustly exponentially stable. *

*Proof.* Firstly, we prove the existence of the almost periodic solution for interval neural network system (3).

By Theorem 12, for any initial value , , interval neural network (3) has a solution which is asymptotically almost periodic. Let be any solution of system (3) associated with a measurable function with the initial value , . Then
for a.a. .

By using (40), we can pick a sequence satisfying and , for all , where is defined in (36). In addition, the sequence is equicontinuous and uniformly bounded. By the Arzelà-Ascoli theorem and the diagonal selection principle, we can select a subsequence of (still denoted by ) that converges uniformly to an absolutely continuous function on any compact set of .

On the other hand, since and is bounded by the boundedness of , the sequence is bounded. Hence, we can also select a subsequence of (still denoted by ) such that converges to a measurable function for any . According to the facts that (i) is an upper semicontinuous set-valued map and (ii) for , as ,

we can get that for any , there exists , such that for and , where is an -dimensional unit ball. Hence, the fact implies that . On the other hand, since is a compact subset of , we have . Noting the arbitrariness of , it follows that for a.a. .

By Lebesgue’s dominated convergence theorem (Lemma 10),
for any and . This implies that is a solution of system (3).

Notice that is asymptotically almost periodic. Then, for any , there exist , , and in any interval with the length , such that , for all . Therefore, there exists a constant such that, when , , for any . Letting , it follows that , for any . This shows that is an almost periodic solution of system (3).

Secondly, we prove that the almost periodic solution of interval neural network system (3) is globally robustly exponentially stable.

Let be an arbitrary solution and let be an almost periodic solution of interval neural network system (3), associated with outputs and , respectively. Consider the change of variables , which transforms (3) into the differential equation
where is measurable, , and .

Similar to in (21), define a Lyapunov functional candidate as
Calculating the derivative of along the solution of system (45), similar to the proof of Theorem 11, we have
where . Combining (46) and (47) gives
This means that the almost periodic solution of interval neural network system (3) is globally robustly exponentially stable. Consequently, the almost periodic solution of system (3) is unique. The proof is complete.

*Remark 15.* As far as we know, none of the existing results concerning the almost periodic dynamical behavior of neural networks with discontinuous activation functions [30–33] has considered global robust exponential stability. In this paper, by constructing an appropriate generalized Lyapunov functional, we have obtained a delay-dependent criterion which guarantees the existence, uniqueness, and global robust exponential stability of the almost periodic solution. Moreover, the given result is formulated in terms of LMIs, which can be easily verified by existing powerful tools such as the LMI toolbox of MATLAB. Therefore, the results of this paper improve the corresponding results in [30–33].

*Remark 16.* In [34–36], some criteria on the robust stability of an equilibrium point for neural networks with discontinuous activation functions have been given. Compared to the main results in [34–36], our results make the following improvements. (1) In [34, 35], the activation function is assumed to be monotonically nondecreasing and bounded. However, from the assumption , we can see that the activation function here can be unbounded. (2) Although the assumption of boundedness was dropped in [36], monotonic nondecrease and a growth condition were indispensable there. In this paper, the activation function is only assumed to be monotonically nondecreasing. (3) In contrast to the models in [34–36], distributed time-varying delays are considered in this paper. If we choose and , then the models in these papers become special cases of our model.

Notice that periodic function can be regarded as a special almost periodic function. Hence, based on Theorems 11 and 14, we can obtain the following.

Corollary 17. *Suppose that , , and are periodic functions and that the assumptions , , and are satisfied. Then* (1) *neural network system (3) has a solution of the IVP on for any initial value , ;* (2) *neural network system (3) has a unique periodic solution which is globally robustly exponentially stable. *

When is a constant external input , system (3) becomes the following. Since a constant function can also be regarded as a special almost periodic function, by applying Theorems 11 and 14, we obtain the following corollary.

Corollary 18. *If the assumptions , , and are satisfied, then* (1) *neural network system (49) has a solution of the IVP on for any initial value , ;* (2) *neural network system (49) has a unique equilibrium point which is globally robustly exponentially stable. *

#### 4. Illustrative Example

*Example 1.* Consider the third-order interval neural network (3) with the following system parameters:
Set , , and . It is easy to check that assumptions hold and , , , and .

Let . Solving the LMI in by using an appropriate LMI solver in MATLAB, feasible positive definite matrices , , and and a positive definite diagonal matrix are obtained as
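As a sanity check on solver output, the positive definiteness of matrices returned by an LMI solver can be verified independently, for example via Sylvester's criterion. The sketch below is illustrative only; the numerical matrix is a hypothetical stand-in, not the feasible solution reported above.

```python
# Sanity check for LMI solver output: a symmetric matrix is positive
# definite iff all of its leading principal minors are strictly positive
# (Sylvester's criterion).

def det(m):
    """Determinant by Gaussian elimination with partial pivoting."""
    n = len(m)
    a = [row[:] for row in m]   # work on a copy
    d = 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(a[r][i]))  # pivot row
        if abs(a[p][i]) < 1e-12:
            return 0.0
        if p != i:
            a[i], a[p] = a[p], a[i]
            d = -d                      # row swap flips the sign
        d *= a[i][i]
        for r in range(i + 1, n):
            f = a[r][i] / a[i][i]
            for c in range(i, n):
                a[r][c] -= f * a[i][c]
    return d

def is_positive_definite(m):
    """Sylvester's criterion: every leading principal minor > 0."""
    return all(det([row[:k] for row in m[:k]]) > 0
               for k in range(1, len(m) + 1))

# Hypothetical solver output (diagonally dominant, hence positive definite):
P = [[4.0, 1.0, 0.0],
     [1.0, 3.0, 0.5],
     [0.0, 0.5, 2.0]]
print(is_positive_definite(P))              # True
print(is_positive_definite([[1.0, 2.0],
                            [2.0, 1.0]]))   # False: minors are 1 and -3
```

For the small matrices arising here this is enough; for larger LMIs one would normally trust the solver's own feasibility certificate or attempt a Cholesky factorization instead.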
and the assumption is also satisfied. Hence, it follows from Theorems 11–14 that system (3) with the parameter ranges given above has a unique almost periodic solution which is globally robustly exponentially stable.

In view of Corollary 17, when the external input is a periodic function, this neural network has a unique periodic solution which is globally robustly exponentially stable; similarly, by Corollary 18, the system with constant input has a unique equilibrium with the same stability property.

As a special case, we choose the system as follows:

Figures 1 and 2 display the state trajectories of this neural network with initial value , when . It can be seen that these trajectories converge to a unique periodic solution. This is in accordance with the conclusion of Corollary 17. Figure 3 displays the state trajectories of this neural network with initial value , when . It can be seen that these trajectories converge to a unique equilibrium point. This is in accordance with the conclusion of Corollary 18.
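The convergence behavior seen in the figures can be reproduced qualitatively even for a toy scalar model. The sketch below, with hypothetical parameters unrelated to the third-order example above, Euler-integrates a discontinuous network $x'(t) = -d\,x(t) + a\,\mathrm{sign}(x(t)) + I$, whose unique equilibrium is $x^* = (a+I)/d$ when $I > a > 0$:

```python
# Toy sketch (hypothetical scalar parameters, not the paper's third-order
# system): forward-Euler simulation of a discontinuous network
#     x'(t) = -d*x(t) + a*sign(x(t)) + I
# illustrating convergence to the unique equilibrium x* = (a + I)/d,
# which lies in the region x > 0 whenever I > a > 0.

def sign(v):
    """Discontinuous activation: -1, 0, or 1."""
    return (v > 0) - (v < 0)

def simulate(x0, d=1.0, a=0.5, ext_input=1.0, dt=1e-3, steps=20_000):
    """Euler-integrate the discontinuous dynamics from initial state x0."""
    x = x0
    for _ in range(steps):
        x += dt * (-d * x + a * sign(x) + ext_input)
    return x

x_star = (0.5 + 1.0) / 1.0   # equilibrium 1.5 for the parameters above
for x0 in (-2.0, 0.0, 4.0):
    print(round(simulate(x0), 3))   # each trajectory ends near 1.5
```

Trajectories starting on either side of the discontinuity cross it in finite time and then decay exponentially toward the equilibrium, mirroring the global exponential convergence asserted by Corollary 18.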